Ahmad Aljebaly
Artificial Neural Networks
Agenda
History of Artificial Neural Networks
What is an Artificial Neural Network?
How does it work?
Learning
Learning paradigms
Supervised learning
Unsupervised learning
Reinforcement learning
Application areas
Advantages and Disadvantages
History of the Artificial Neural Networks
 The history of ANNs stems from the 1940s, the decade of the first electronic
computer.
 However, the first important step took place in 1957, when Rosenblatt
introduced the first concrete neural model, the perceptron. Rosenblatt also
took part in constructing the first successful neurocomputer, the Mark I
Perceptron. After this, the development of ANNs proceeded along the lines
outlined on the following slides.
History of the Artificial Neural Networks
 Rosenblatt's original perceptron model contained only one layer. From this,
a multi-layered model was derived in 1960. At first, the use of the multi-
layer perceptron (MLP) was complicated by the lack of an appropriate
learning algorithm.
 In 1974, Werbos introduced a backpropagation algorithm for the
three-layer perceptron network.
History of the Artificial Neural Networks
 The application area of MLP networks remained rather limited until the
breakthrough in 1986, when a general backpropagation algorithm for the
multi-layer perceptron was introduced by Rumelhart and McClelland.
 In 1982, Hopfield presented his idea of a neural network. Unlike the
neurons in an MLP, the Hopfield network consists of only one layer whose
neurons are fully connected with each other.
History of the Artificial Neural Networks
 Since then, new versions of the Hopfield network have been developed.
The Boltzmann machine has been influenced by both the Hopfield network
and the MLP.
History of the Artificial Neural Networks
 In 1988, Radial Basis Function (RBF) networks were first introduced by
Broomhead & Lowe. Although the basic idea of RBF had been developed some
30 years earlier under the name "method of potential functions", the work by
Broomhead & Lowe opened a new frontier in the neural network
community.
History of the Artificial Neural Networks
 A quite different kind of network model is the Self-Organizing Map (SOM),
introduced by Kohonen in 1982. The SOM is a kind of topological map which
organizes itself based on the input patterns it is trained with.
The SOM originated from the LVQ (Learning Vector Quantization)
network, the underlying idea of which was also Kohonen's, in 1972.
History of Artificial Neural Networks
Since then, research on artificial neural networks has
remained active, leading to many new network types, as
well as hybrid algorithms and hardware for neural
information processing.
Artificial Neural Network
An artificial neural network consists of a pool of simple
processing units which communicate by sending signals to
each other over a large number of weighted connections.
Artificial Neural Network
A parallel distributed processing model comprises the following major aspects
(a small illustrative sketch follows this list):
 a set of processing units (cells);
 a state of activation for every unit, which is equivalent to the output of the
unit;
 connections between the units (generally, each connection is defined by a
weight);
 a propagation rule, which determines the effective input of a unit from its
external inputs;
 an activation function, which determines the new level of activation based
on the effective input and the current activation;
 an external input for each unit;
 a method for information gathering (the learning rule);
 an environment within which the system must operate, providing input
signals and, if necessary, error signals.
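As an illustration only, the following Python sketch (hypothetical names, not from the slides) shows how these aspects might map onto a minimal data structure: units carry an activation state and an external input, connections carry weights, a propagation rule computes the effective input, and an activation function updates the state. The learning rule and the environment would sit outside this class.

import math

class Unit:
    """A single processing unit (cell) with a state of activation."""

    def __init__(self):
        self.activation = 0.0        # state of activation, equivalent to the unit's output
        self.incoming = []           # connections: list of (source_unit, weight) pairs
        self.external_input = 0.0    # external input for this unit

    def effective_input(self):
        # Propagation rule: weighted sum of connected units' outputs plus the external input.
        return self.external_input + sum(w * src.activation
                                         for src, w in self.incoming)

    def update_activation(self):
        # Activation function: here a sigmoid of the effective input (one possible choice).
        self.activation = 1.0 / (1.0 + math.exp(-self.effective_input()))

# Example: one unit feeding another through a weighted connection.
a, b = Unit(), Unit()
a.external_input = 1.0
a.update_activation()
b.incoming.append((a, 0.5))
b.update_activation()
print(b.activation)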
Computers vs. Neural Networks
“Standard” Computers vs. Neural Networks:
 one CPU vs. highly parallel processing
 fast processing units vs. slow processing units
 reliable units vs. unreliable units
 static infrastructure vs. dynamic infrastructure
Why Artificial Neural Networks?
There are two basic reasons why we are interested in
building artificial neural networks (ANNs):
• Technical viewpoint: Some problems such as
character recognition or the prediction of future
states of a system require massively parallel and
adaptive processing.
• Biological viewpoint: ANNs can be used to
replicate and simulate components of the human
(or animal) brain, thereby giving us insight into
natural information processing.
Artificial Neural Networks
• The “building blocks” of neural networks are the
neurons.
• In technical systems, we also refer to them as units or nodes.
• Basically, each neuron
 receives input from many other neurons.
 changes its internal state (activation) based on the current
input.
 sends one output signal to many other neurons, possibly
including its input neurons (recurrent network).
Artificial Neural Networks
• Information is transmitted as a series of electric
impulses, so-called spikes.
• The frequency and phase of these spikes encode the
information.
• In biological systems, one neuron can be connected to as
many as 10,000 other neurons.
• Usually, a neuron receives its information from other
neurons in a confined area, its so-called receptive field.
How do ANNs work?
An artificial neural network (ANN) is either a hardware
implementation or a computer program which strives to
simulate the information processing capabilities of its biological
exemplar. ANNs are typically composed of a great number of
interconnected artificial neurons. The artificial neurons are
simplified models of their biological counterparts.
An ANN is a technique for solving problems by constructing software
that works somewhat like our brains.
How do our brains work?
 The brain is a massively parallel information-processing system.
 Our brains are a huge network of processing elements. A typical brain contains a
network of roughly 10 billion neurons.
How do our brains work?
 A processing element
Dendrites: Input
Cell body: Processor
Synapse: Link
Axon: Output
How do our brains work?
 A processing element
A neuron is connected to other neurons through about 10,000
synapses
How do our brains work?
 A processing element
A neuron receives input from other neurons. Inputs are combined.
How do our brains work?
 A processing element
Once input exceeds a critical level, the neuron discharges a spike ‐
an electrical pulse that travels from the body, down the axon, to
the next neuron(s)
How do our brains work?
 A processing element
The axon endings almost touch the dendrites or cell body of the
next neuron.
How do our brains work?
 A processing element
Transmission of an electrical signal from one neuron to the next is
effected by neurotransmitters.
How do our brains work?
 A processing element
Neurotransmitters are chemicals which are released from the first neuron
and which bind to the second.
How do our brains work?
 A processing element
This link is called a synapse. The strength of the signal that
reaches the next neuron depends on factors such as the amount of
neurotransmitter available.
How do ANNs work?
An artificial neuron is an imitation of a human neuron
How do ANNs work?
• Now, let us have a look at the model of an artificial neuron.
How do ANNs work?
[Diagram: inputs x1, x2, …, xm feed a summation unit ∑, whose result y is the output]
Processing: the neuron simply sums its inputs:
y = x1 + x2 + … + xm
How do ANNs work?
Not all inputs are equal
[Diagram: each input xi is multiplied by its weight wi before the summation unit ∑ produces the output y]
Processing: the neuron forms a weighted sum of its inputs:
y = x1·w1 + x2·w2 + … + xm·wm
How do ANNs work?
The signal is not passed on to the next neuron verbatim; it first goes
through a transfer function (activation function).
[Diagram: the weighted sum vk = x1·w1 + x2·w2 + … + xm·wm of the inputs is passed through the transfer function f, so the output is y = f(vk)]
The output is a function of the inputs, affected by the weights and the
transfer function.
Three types of layers: Input, Hidden, and
Output
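To make the picture concrete, here is a minimal Python sketch of the artificial neuron described above, assuming a sigmoid as the transfer function and made-up example values for the inputs and weights:

import math

def sigmoid(v):
    # A common non-linear transfer (activation) function.
    return 1.0 / (1.0 + math.exp(-v))

def neuron_output(inputs, weights, transfer=sigmoid):
    # Weighted sum of the inputs: v = x1*w1 + x2*w2 + ... + xm*wm
    v = sum(x * w for x, w in zip(inputs, weights))
    # The output is the transfer function applied to the weighted sum.
    return transfer(v)

# Example with made-up values:
x = [0.5, 0.9, -0.3]        # inputs x1..x3
w = [0.8, -0.2, 0.4]        # weights w1..w3
print(neuron_output(x, w))  # ≈ 0.525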
Artificial Neural Networks
An ANN can:
1. compute any computable function, given an appropriate choice of
network topology and weight values;
2. learn from experience!
 Specifically, by trial and error.
Learning by Trial and Error
A continuous process of:
Trial:
Processing an input to produce an output (in ANN terms:
compute the output for a given input).
Evaluate:
Evaluating this output by comparing the actual output with
the expected output.
Adjust:
Adjusting the weights.
Example: XOR
[Network diagram: an input layer with two neurons, a hidden layer with
three neurons, and an output layer with one neuron]
How does it work?
[The same 2-3-1 network: input layer with two neurons, hidden layer with
three neurons, output layer with one neuron]
How does it work?
Set the initial values of the weights randomly.
Input: the truth table of XOR.
Do
 Read an input (e.g. 0 and 0)
 Compute an output (e.g. 0.60543)
 Compare it to the expected output (diff = 0.60543)
 Modify the weights accordingly
Loop until a condition is met
 Condition: a certain number of iterations
 Condition: an error threshold
A small training sketch of this loop follows below.
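As an illustration only, this Python/NumPy sketch runs the loop above for the XOR truth table with a 2-3-1 network (two inputs, three hidden neurons, one output), sigmoid units, bias terms, and plain gradient-descent back-propagation. The learning rate, iteration count, and random seed are arbitrary choices, and convergence can vary with the initial weights.

import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # XOR truth table inputs
T = np.array([[0], [1], [1], [0]], dtype=float)              # expected outputs

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def add_bias(a):
    # Append a constant 1 so each layer also learns a bias weight.
    return np.hstack([a, np.ones((a.shape[0], 1))])

W1 = rng.uniform(-1, 1, (3, 3))   # (2 inputs + bias) -> 3 hidden neurons
W2 = rng.uniform(-1, 1, (4, 1))   # (3 hidden neurons + bias) -> 1 output neuron

learning_rate = 0.5
for _ in range(20000):                         # condition: fixed number of iterations
    H = sigmoid(add_bias(X) @ W1)              # hidden-layer outputs
    Y = sigmoid(add_bias(H) @ W2)              # network output
    error = T - Y                              # compare with the expected output
    dY = error * Y * (1 - Y)                   # output-layer delta (sigmoid derivative)
    dH = (dY @ W2[:-1].T) * H * (1 - H)        # hidden-layer delta (bias row excluded)
    W2 += learning_rate * add_bias(H).T @ dY   # adjust the weights
    W1 += learning_rate * add_bias(X).T @ dH

print(np.round(Y, 3))   # should approach [[0], [1], [1], [0]]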
Design Issues
Initial weights (small random values ∈ [−1, 1])
Transfer function (how the inputs and weights are combined to
produce the output)
Error estimation
Weight adjustment
Number of neurons
Data representation
Size of the training set
Transfer Functions
Linear: the output is proportional to the total
weighted input.
Threshold: the output is set to one of two values,
depending on whether the total weighted input is
greater than or less than some threshold value.
Non-linear: the output varies continuously, but not
linearly, as the input changes.
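As a small illustrative sketch (not from the slides), the three kinds of transfer function might look as follows in Python; the sigmoid is used here as one common non-linear choice, and the slope and threshold values are arbitrary defaults.

import math

def linear(v, slope=1.0):
    # Linear: the output is proportional to the total weighted input v.
    return slope * v

def threshold(v, theta=0.0):
    # Threshold: one of two values, depending on whether v exceeds the threshold theta.
    return 1.0 if v > theta else 0.0

def sigmoid(v):
    # Non-linear: the output varies continuously but not linearly with v.
    return 1.0 / (1.0 + math.exp(-v))

print(linear(0.3), threshold(0.3), sigmoid(0.3))   # 0.3, 1.0, ≈ 0.574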
Error Estimation
The root mean square error (RMSE) is a
frequently used measure of the differences between
the values predicted by a model or an estimator and the
values actually observed from the thing being
modelled or estimated.
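In formula form, RMSE = sqrt((1/n) · Σ (predicted_i − observed_i)²). A minimal Python version, with made-up example values, might be:

import math

def rmse(predicted, observed):
    # Root mean square error between predicted and observed values.
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

print(rmse([0.9, 0.1, 0.8], [1.0, 0.0, 1.0]))   # ≈ 0.141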
Weight Adjustment
After each iteration, the weights should be adjusted to
minimize the error. Two approaches:
– trying all possible weights
– back-propagation
Back Propagation
Back-propagation is an example of supervised learning; it is
used at each layer to minimize the error between the
layer’s response and the actual data.
The error at each hidden layer is an average of the
evaluated error.
Networks with hidden layers are trained this way.
Back Propagation
N is a neuron.
Nw is one of N’s input weights.
Nout is N’s output.
Nw = Nw + ΔNw
ΔNw = Nout × (1 − Nout) × NErrorFactor
NErrorFactor = NExpectedOutput − NActualOutput
This works only for the last layer, as only there do we know
both the actual output and the expected output.
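Below is a minimal Python sketch of this last-layer update, using hypothetical names that mirror the slide's notation and made-up example values. The slide's rule is applied verbatim; standard back-propagation would additionally multiply each ΔNw by the input attached to that weight and by a learning rate.

import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def update_last_layer(weights, inputs, expected_output):
    # Forward pass: the neuron's actual output Nout.
    n_out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    # NErrorFactor = NExpectedOutput - NActualOutput
    error_factor = expected_output - n_out
    # Slide's rule: delta_w = Nout * (1 - Nout) * NErrorFactor, added to every weight.
    delta_w = n_out * (1 - n_out) * error_factor
    return [w + delta_w for w in weights]

# Made-up example: two weights, inputs (1, 0), expected output 1.
print(update_last_layer([0.4, -0.6], [1.0, 0.0], expected_output=1.0))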
Number of Neurons
Many neurons:
 Higher accuracy
 Slower
 Risk of over-fitting: memorizing rather than understanding;
the network will be useless on new problems.
Few neurons:
 Lower accuracy
 Possibly unable to learn at all
The goal is an optimal number.
Data Representation
Usually the input/output data needs pre-processing:
 Pictures: pixel intensities
 Text: encoded as a pattern
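As an illustration only (these encodings are made-up examples, not prescribed by the slides), pixel intensities might be scaled to [0, 1] and text turned into a numeric pattern before being fed to the network:

def scale_pixels(pixels, max_value=255):
    # Scale raw pixel intensities (0..max_value) to the range [0, 1].
    return [p / max_value for p in pixels]

def encode_text(text, alphabet="abcdefghijklmnopqrstuvwxyz"):
    # Encode each letter as its scaled position in the alphabet: one simple pattern.
    return [(alphabet.index(c) + 1) / len(alphabet) for c in text.lower() if c in alphabet]

print(scale_pixels([0, 128, 255]))   # [0.0, ≈0.502, 1.0]
print(encode_text("xor"))            # a simple numeric pattern for the word "xor"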
Size of the Training Set
There is no one-size-fits-all formula.
Over-fitting can occur if a “good” training set is not
chosen.
What constitutes a “good” training set?
 Samples must represent the general population.
 Samples must contain members of each class.
 Samples in each class must contain a wide range of
variations or noise effects.
The size of the training set is related to the number of
hidden neurons.
Learning Paradigms
Supervised learning
Unsupervised learning
Reinforcement learning
Supervised learning
This is what we have seen so far!
A network is fed with a set of training samples (inputs and
corresponding output), and it uses these samples to learn
the general relationship between the inputs and the outputs.
This relationship is represented by the values of the weights
of the trained network.
Unsupervised learning
No desired output is associated with the training data!
Faster than supervised learning
Used to find structure within the data:
 Clustering
 Compression
Reinforcement learning
Like supervised learning, but:
Weight adjustment is not directly derived from the error
value.
The error value is used to randomly shuffle the weights!
Learning is relatively slow due to this randomness.
Application Areas
Function approximation
including time-series prediction and modelling.
Classification
including pattern and sequence recognition, novelty
detection and sequential decision making
(radar systems, face identification, handwritten text recognition).
Data processing
including filtering, clustering, blind source separation and
compression
(data mining, e-mail spam filtering).
Advantages / Disadvantages
Advantages
Adapt to unknown situations
Powerful: can model complex functions.
Easy to use: learns by example, and very little user or
domain-specific expertise is needed.
Disadvantages
Forgets
Not exact
Large complexity of the network structure
Conclusion
Artificial Neural Networks are an imitation of biological
neural networks, but much simpler ones.
Computing has a lot to gain from neural networks:
their ability to learn by example makes them very flexible and
powerful, and there is no need to devise an algorithm
to perform a specific task.
Conclusion
Neural networks also contribute to areas of research such as
neurology and psychology. They are regularly used to model
parts of living organisms and to investigate the internal
mechanisms of the brain.
Many factors affect the performance of ANNs, such as the
transfer functions, the size of the training sample, the network
topology, and the weight-adjustment algorithm.
References
 Craig Heller and David Sadava, Life: The Science of Biology, fifth edition,
Sinauer Associates, Inc., USA, 1998.
 Nicolas Galoppo von Borries, Introduction to Artificial Neural Networks.
 Tom M. Mitchell, Machine Learning, WCB McGraw-Hill, Boston, 1997.
Thank You
Q. How does each neuron work in an ANN?
What is back-propagation?
A neuron: receives input from many other neurons;
changes its internal state (activation) based on the
current input;
sends one output signal to many other neurons, possibly
including its input neurons (making the ANN a recurrent network).
Back-propagation is a type of supervised learning, used at each
layer to minimize the error between the layer’s response and
the actual data.