1
Renas R. Rekany-Nawroz University
Artificial Neural Networks
Renas R. Rekany
2016/2017 Computer Science & I.T.
2
Renas R. Rekany-Nawroz University
Artificial Neural Networks
An artificial neural network (ANN) is a system based on (inspired by) biological
neural networks, such as the brain.
The brain has approximately 100 billion neurons, which communicate through
electrochemical signals (the neurons are connected through junctions called synapses).
Each neuron receives thousands of connections from other neurons, constantly
receiving incoming signals that reach the cell body.
3
Renas R. Rekany-Nawroz University
Biological Neuron
Information transmission happens at the synapses.
A biological neuron is the most basic information processing unit in the
nervous system. A biological neuron consists of the following parts:
1. Dendrites (input)
2. Cell body
3. Axon (output)
A biological neuron takes signals from its dendrites, processes them, and
outputs a signal from its axon based on the input signals.
4
Renas R. Rekany-Nawroz University
Model of an ANN
5
Renas R. Rekany-Nawroz University
Universal Properties of Neurons
Excitability
All cells are excitable; that is, they respond to environmental
changes. Neurons exhibit this property to the highest degree.
Conductivity
Neurons respond to stimuli by producing electrical signals that
are quickly conducted to other cells at distant locations.
Secretion
When the electrical signal reaches the end of a nerve fiber,
the neuron secretes a chemical neurotransmitter that crosses
the gap and stimulates the next cell.
6
Renas R. Rekany-Nawroz University
Properties of Neurons System
 Parallel, distributed information processing.
 High degree of connectivity between basic
processing units.
 Connections are modified based on experience.
 Learning is a constant process.
 Learning is based on local information.
7
Renas R. Rekany-Nawroz University
Neurons
8
Renas R. Rekany-Nawroz University
Model of an ANN
1. x1, x2, …, xn are the inputs to the neuron.
2. w1, w2, …, wn are the weights applied to the inputs.
3. net = x1*w1 + x2*w2 + … + xn*wn is the weighted input sum.
4. f() is the activation function.
5. y = f(net) is the output of the neuron.
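A minimal sketch of this neuron model in Python (the step activation and the example weights below are illustrative assumptions, not values from the slides):

```python
def step(net):
    # Step activation: output 1 when the weighted sum is non-negative, else 0.
    return 1 if net >= 0 else 0

def neuron_output(x, w, f=step):
    # net = x1*w1 + x2*w2 + ... + xn*wn, then y = f(net)
    net = sum(xi * wi for xi, wi in zip(x, w))
    return f(net)

# Illustrative example: two inputs with assumed weights 0.5 and -0.2.
print(neuron_output([1, 1], [0.5, -0.2]))  # net = 0.3 -> output 1
```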
9
Renas R. Rekany-Nawroz University
Artificial Neural Network Architecture
Single-Layer Neural Networks are networks in which the output is passed
directly from the input neurons to the output neurons, without any hidden
processing neurons in between.
These neurons are called perceptrons, and such networks are usually used to
solve simple mathematical models.
10
Renas R. Rekany-Nawroz University
Artificial Neural Network Architecture
Non-linear activation functions: Step(input) and Sign(input)
[Graphs of the step and sign functions]
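Sketches of these two non-linear activations (assuming the common convention that both return their high value at zero):

```python
def step(x):
    # Step function: outputs 0 or 1.
    return 1 if x >= 0 else 0

def sign(x):
    # Sign function: outputs -1 or +1.
    return 1 if x >= 0 else -1
```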
11
Renas R. Rekany-Nawroz University
Artificial Neural Network Architecture
Multi-Layer Neural Networks are neural networks in which the input
neurons pass signals and information to other processing elements
inside a hidden layer; afterwards, the information is passed to the
output neurons.
These are called back-propagation
networks, and they are usually
used to solve complex problems.
12
Renas R. Rekany-Nawroz University
Learning
Learning involves updating the network parameters so that the network
can perform a specific task as desired.
This involves testing the network and performing certain procedures to
update the weights so that the desired output is met.
There are two types of learning:
• Supervised learning
• Unsupervised learning
13
Renas R. Rekany-Nawroz University
Learning
Supervised learning
In supervised learning, a well-defined set of inputs and outputs is
provided to the network. This enables the network to generalize its
process so that, when presented with a set of inputs, the desired
output is produced.
Unsupervised learning
In unsupervised learning, there is a set of inputs without a well-defined
set of outputs. Such networks try to generalize certain characteristics
in the input data and classify the data accordingly.
14
Renas R. Rekany-Nawroz University
ANN Applications
• Machine vision
• Pattern recognition
• Intelligent security systems
• Intelligent medical devices
• Intelligent control
• Advanced robotics
• Intelligent signal processing and data analysis
15
Renas R. Rekany-Nawroz University
The Concept of Linear Separability
The concept of linear separability is based on mapping the outputs of a
function onto the axes of its inputs.
Example: an “AND” gate can be mapped as follows. When the outputs can be
separated by a single line, the problem can be solved with perceptron
networks; however, if it takes more than one line to separate the outputs,
then back-propagation networks must be used. For example:
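A small sketch of linear separability for the AND gate; the separating line x1 + x2 = 1.5 is an illustrative assumption (any line with (1,1) on one side and the other three points on the other would do):

```python
# AND gate outputs: only (1, 1) maps to 1.
and_outputs = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}

def above_line(x1, x2):
    # Side of the candidate separating line x1 + x2 = 1.5 (assumed for illustration).
    return 1 if x1 + x2 - 1.5 >= 0 else 0

# Every point lands on the side matching its AND output, so one line separates the classes.
print(all(above_line(x1, x2) == y for (x1, x2), y in and_outputs.items()))  # True
```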
16
Renas R. Rekany-Nawroz University
Perceptron Neural Network
A perceptron neural network is a single-layer network in which an input
is passed to the activation function and an output is generated.
Perceptrons are used as linear classifiers, in which an input belongs to
one class or the other. These neural networks are trained using supervised
learning methods and usually use the hardlim activation function.
17
Renas R. Rekany-Nawroz University
Perceptron Characteristics
• Single Layer Network
• Supervised learning method
• Hardlim activation function
• X1,X2, …, Xn are inputs
• W1,W2, …, Wn are weights
applied to the inputs
• Bias or threshold is the
limit by which the output
is decided
• Net = X1W1+X2W2, … XnWn
• α Alpha is the learning rate (speed)
• f(net - 𝛳) is the activation function
[Diagram: perceptron with inputs X1, X2, …, Xn, weights W1, W2, …, Wn, bias 𝛳, summation Net, activation f(net), and output y]
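A minimal sketch of the forward pass described by these characteristics; the hardlim convention assumed here (output 1 when net - 𝛳 ≥ 0) matches the worked AND example on the later slides:

```python
def hardlim(x):
    # Hard-limit activation: 1 if x >= 0, else 0.
    return 1 if x >= 0 else 0

def perceptron(x, w, theta):
    # Net = X1*W1 + X2*W2 + ... + Xn*Wn, output y = f(Net - theta).
    net = sum(xi * wi for xi, wi in zip(x, w))
    return hardlim(net - theta)

# Using the initial values of the later AND-gate example: W = [0.3, -0.1], theta = 0.2.
print(perceptron([1, 0], [0.3, -0.1], 0.2))  # net = 0.3, net - theta = 0.1 -> 1
```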
18
Renas R. Rekany-Nawroz University
Artificial Neurons
19
Renas R. Rekany-Nawroz University
Artificial neurons
one possible model
[Diagram: inputs x1, x2, …, xn with weights w1, w2, …, wn feeding a single output y]

$z = \sum_{i=1}^{n} w_i x_i, \qquad y = H(z)$
20
Renas R. Rekany-Nawroz University
From Logical Neurons to Finite
Automata
AND: weights 1, 1; threshold 1.5
NOT: weight -1; threshold 0
OR: weights 1, 1; threshold 0.5
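A sketch of these three logical neurons as threshold units, assuming (as in the perceptron slides) that a unit fires when its weighted input sum reaches the threshold:

```python
def fires(inputs, weights, threshold):
    # Logical (McCulloch-Pitts style) neuron: 1 when the weighted sum reaches the threshold.
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

AND = lambda a, b: fires([a, b], [1, 1], 1.5)
OR  = lambda a, b: fires([a, b], [1, 1], 0.5)
NOT = lambda a:    fires([a],    [-1],  0)

print([AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 0, 0, 1]
print([OR(a, b)  for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 1]
print([NOT(a)    for a in (0, 1)])                               # [1, 0]
```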
21
Renas R. Rekany-Nawroz University
Neural network mathematics
[Diagram: inputs feeding four first-layer neurons, three second-layer neurons, and one output neuron]

First-layer outputs:
$y_1^1 = f(x_1, w_1^1), \quad y_2^1 = f(x_2, w_2^1), \quad y_3^1 = f(x_3, w_3^1), \quad y_4^1 = f(x_4, w_4^1)$

Second-layer outputs, with $y^1 = (y_1^1, y_2^1, y_3^1, y_4^1)^T$:
$y_1^2 = f(y^1, w_1^{12}), \quad y_2^2 = f(y^1, w_2^{12}), \quad y_3^2 = f(y^1, w_3^{12})$

Network output, with $y^2 = (y_1^2, y_2^2, y_3^2)^T$:
$y_{Out} = f(y^2, w_1^3)$
22
Renas R. Rekany-Nawroz University
Neural network mathematics
Neural network: input / output transformation
$y_{out} = F(x, W)$
W is the matrix of all weight vectors.
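A sketch of this input/output transformation for a small fully connected network; the sigmoid activation, the 4-3-1 layer sizes, and the random weights are illustrative assumptions:

```python
import math
import random

def f(inputs, weights):
    # Single neuron: weighted sum of its inputs followed by a sigmoid activation (assumed).
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-net))

def F(x, W):
    # W holds all weight vectors, grouped by layer; propagate the input layer by layer.
    y = x
    for layer in W:
        y = [f(y, w) for w in layer]
    return y

random.seed(0)
W = [[[random.uniform(-1, 1) for _ in range(4)] for _ in range(3)],  # 4 inputs -> 3 neurons
     [[random.uniform(-1, 1) for _ in range(3)]]]                    # 3 neurons -> 1 output
print(F([1.0, 0.5, -0.3, 0.8], W))  # y_out = F(x, W)
```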
23
Renas R. Rekany-Nawroz University
Perceptron Learning
Perceptron learning is based on calculating the error using the equation
below:
e = y_desired - y_actual
Afterwards, we compute the weight change (delta weight) for each input,
based on the equation below:
ΔWi = (α)(Xi)(e)
The delta weight is added to each contributing weight using:
Wi = Wi + ΔWi
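A sketch of a single learning step under these rules, using the hardlim-with-threshold activation from the perceptron slides (𝛳 is assumed fixed during training, as in the worked examples that follow):

```python
def predict(x, w, theta):
    # y_actual = hardlim(net - theta), with net = X1*W1 + X2*W2 + ... + Xn*Wn.
    net = sum(xi * wi for xi, wi in zip(x, w))
    return 1 if net - theta >= 0 else 0

def train_step(x, y_desired, w, theta, alpha):
    # e = y_desired - y_actual;  dWi = alpha * Xi * e;  Wi = Wi + dWi
    e = y_desired - predict(x, w, theta)
    return [wi + alpha * xi * e for wi, xi in zip(w, x)]

# One step on input (1, 0) of the AND example below: W = [0.3, -0.1], theta = 0.2, alpha = 0.1.
print([round(v, 2) for v in train_step([1, 0], 0, [0.3, -0.1], 0.2, 0.1)])  # [0.2, -0.1]
```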
24
Renas R. Rekany-Nawroz University
Example
Build a perceptron neural network with one perceptron to perform
the action of the logical AND gate.
The input vectors are X1 = [0,1] and X2 = [0,1], the input set is
p = [0,0; 0,1; 1,0; 1,1], and the respective target vector is t = [0; 0; 0; 1].
Assume that 𝛳 = 0.2, α = 0.1, W1 = 0.3, and W2 = -0.1.
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
1
0 0 0 0 0 0 0 0.3 -0.1
0 1 0 0 0 0 0 0.3 -0.1
1 0 0 1 -1 -0.1 0 0.2 -0.1
1 1 1 0 1 0.1 0.1 0.3 0
25
Renas R. Rekany-Nawroz University
Example
𝛳 = 0.2, α = 0.1.
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
2
0 0 0 0 0 0 0 0.3 0
0 1 0 0 0 0 0 0.3 0
1 0 0 1 -1 -0.1 0 0.2 0
1 1 1 1 0 0 0 0.2 0
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
3
0 0 0 0 0 0 0 0.2 0
0 1 0 0 0 0 0 0.2 0
1 0 0 1 -1 -0.1 0 0.1 0
1 1 1 0 1 0.1 0.1 0.3 0.1
26
Renas R. Rekany-Nawroz University
Example
𝛳 = 0.2, α = 0.1.
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
4
0 0 0 0 0 0 0 0.3 0.1
0 1 0 0 0 0 0 0.3 0.1
1 0 0 1 -1 -0.1 0 0.2 0.1
1 1 1 1 0 0 0 0.2 0.1
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
5
0 0 0 0 0 0 0 0.2 0.1
0 1 0 0 0 0 0 0.2 0.1
1 0 0 1 -1 -0.1 0 0.1 0.1
1 1 1 1 0 0 0 0.1 0.1
27
Renas R. Rekany-Nawroz University
Example
The last epoch for this example, with the same input vectors X1 = [0,1] and
X2 = [0,1], input set p = [0,0; 0,1; 1,0; 1,1], and target vector t = [0; 0; 0; 1].
Hint: 𝛳 = 0.2, α = 0.1, initial W1 = 0.3 and W2 = -0.1. The error is zero for
every input in this epoch, so training has converged.
epoch X1 X2 Yd Yact e ΔW1 ΔW2 W1 W2
6
0 0 0 0 0 0 0 0.1 0.1
0 1 0 0 0 0 0 0.1 0.1
1 0 0 0 0 0 0 0.1 0.1
1 1 1 1 0 0 0 0.1 0.1
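A sketch of the full training loop behind these tables, under the same assumptions as the single-step sketch above (hardlim with fixed 𝛳, firing when net - 𝛳 ≥ 0). Exact fractions are used so that the net = 𝛳 boundary cases behave as in hand calculation; the loop stops once a whole epoch produces no error and reaches the same final weights, W1 = W2 = 0.1:

```python
from fractions import Fraction as Fr

def predict(x1, x2, w1, w2, theta):
    # Fires (outputs 1) when net reaches the threshold.
    return 1 if x1 * w1 + x2 * w2 - theta >= 0 else 0

def train_and_gate(w1=Fr(3, 10), w2=Fr(-1, 10), theta=Fr(1, 5), alpha=Fr(1, 10), max_epochs=10):
    data = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]  # rows of (X1, X2, Yd)
    for epoch in range(1, max_epochs + 1):
        errors = 0
        for x1, x2, yd in data:
            e = yd - predict(x1, x2, w1, w2, theta)
            w1, w2 = w1 + alpha * x1 * e, w2 + alpha * x2 * e
            errors += abs(e)
        print(f"epoch {epoch}: W1 = {float(w1)}, W2 = {float(w2)}")
        if errors == 0:  # stop once a full epoch produces no error
            break
    return w1, w2

train_and_gate()  # ends with W1 = W2 = 0.1, the final weights shown in the tables
```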
28
Renas R. Rekany-Nawroz University
NOT Gate
The last epoch for this example, with input vector X = [0,1], input set
p = [0; 1], and respective target vector t = [1; 0]. Hint: 𝛳 = 0, α = 0.1,
W = -1. With these values the error is zero for both inputs, so the weight is
never updated.
epoch x Yd Yact e ΔW W1
1
0 1 1 0 0 -1
1 0 0 0 0 -1
29
Renas R. Rekany-Nawroz University
Multi-Layer Perceptron
• One or more hidden layers
• Sigmoid activation functions
[Diagram: input data feeding a 1st hidden layer, a 2nd hidden layer, and an output layer]
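A sketch of the sigmoid (logistic) activation typically used in these hidden layers:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: smooth and differentiable, with outputs in (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0.0), round(sigmoid(2.0), 3))  # 0.5 0.881
```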
30
Renas R. Rekany-Nawroz University
Multi-Layer Perceptron Application
Structure and the types of decision regions it can form:
• Single-Layer: half plane bounded by a hyperplane
• Two-Layer: convex open or closed regions
• Three-Layer: arbitrary (complexity limited by the number of nodes)
[Diagram: example decision regions separating classes A and B for each structure]
31
Renas R. Rekany-Nawroz University
References
1. http://neuron.eng.wayne.edu/software.html (many useful examples)
2. http://ieee.uow.edu.au/~daniel/software/libneural/BPN_tutorial/BPN_English/BPN_English/BPN_English.html
3. http://www.ai-junkie.com/
4. http://diwww.epfl.ch/mantra/tutorial/english/