Artificial Neural Networks (ANNs)
XOR Step-By-Step

Menoufia University
Faculty of Computers and Information
All Departments
Artificial Intelligence

Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Classification Example: XOR

A B | Output
0 1 |   1
1 0 |   1
0 0 |   0
1 1 |   0
Neural Networks
The network consists of Input, Hidden, and Output layers.
[Figure: the four XOR points plotted in the A-B plane.]
Neural Networks
[Figure: the XOR points cannot be separated by a single straight line.]
XOR can't be solved linearly, so a single-layer perceptron can't work. Use a hidden layer.
Neural Networks
Add a hidden layer between the input and output layers.
Input Layer
Two input neurons, one per input: A and B.
Hidden Layer
Start with two hidden neurons.
Output Layer
One output neuron producing Yj (1/0).
Weights
Every connection between neurons carries a weight Wi.
Weights: Input Layer - Hidden Layer
Weights W1, W2, W3, W4 connect the two inputs to the two hidden neurons.
Weights: Hidden Layer - Output Layer
Weights W5 and W6 connect the two hidden neurons to the output neuron.
All Layers
The complete network: W1-W4 between the input and hidden layers; W5, W6 between the hidden and output layers.
Activation Function
Each neuron applies an activation function to produce its output Yj (1/0).
Activation Function: Components
The activation function has inputs and an output. Its input s is computed from the neuron's inputs and weights.
Sum of Products (SOP)
s = SOP(Xi, Wi), where Xi = inputs (X1, X2) and Wi = weights:
s = sum over i = 1 to m of Xi*Wi
Each Hidden/Output Layer Neuron has its SOP:
S1 = X1*W1 + X2*W3
S2 = X1*W2 + X2*W4
S3 = S1*W5 + S2*W6
Activation Function: Outputs
The activation function F(s) maps the SOP s to the output, which gives the class label. Each hidden/output layer neuron has its own activation function.
Activation Functions
Candidates: Piecewise Linear, Sigmoid, Binary.

Which activation function to use? The problem has TWO class labels, so we need an activation function that gives TWO outputs: Binary.
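The binary (step) activation chosen here can be sketched as a one-line function. The threshold convention, s >= 0 giving 1 and otherwise 0, follows the bin definition used later in the deck:

```python
def bin_step(s):
    # binary step activation: exactly two possible outputs,
    # matching the two class labels of the XOR problem
    return 1 if s >= 0 else 0

print(bin_step(0.5), bin_step(-0.5))  # 1 0
```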
Activation Function
The output is Yj = bin(s), producing 1/0.
Bias
In addition to weights, each hidden/output neuron has a bias.
Bias: Hidden Layer Neurons
Each hidden neuron receives an extra input fixed at +1, weighted by a bias value: b1 for the first hidden neuron and b2 for the second.
Bias: Output Layer Neurons
The output neuron receives an extra +1 input weighted by bias b3.
All Bias Values
The network now has three biases (b1, b2, b3) in addition to the six weights (W1-W6).
Bias: Add Bias to SOP
Without bias:
S1 = X1*W1 + X2*W3
S2 = X1*W2 + X2*W4
S3 = S1*W5 + S2*W6
With bias:
S1 = (+1)*b1 + X1*W1 + X2*W3
S2 = (+1)*b2 + X1*W2 + X2*W4
S3 = (+1)*b3 + S1*W5 + S2*W6
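The biased SOP can be sketched as a small helper (a sketch; the function name is illustrative, and the example values b1 = -1.5, W1 = W3 = 1 are the ones the training example below uses):

```python
def sop(bias, inputs, weights):
    # S = (+1)*b + sum of input-weight products
    return bias + sum(x * w for x, w in zip(inputs, weights))

# S1 for inputs X1=1, X2=0 with b1=-1.5, W1=W3=1:
print(sop(-1.5, [1, 0], [1, 1]))  # -0.5
```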
Bias Importance
Consider the line y = ax + b in the X-Y plane. The term b is the y-intercept.
With b = 0 the line passes through the origin. With b = +v the line shifts up; with b = -v it shifts down. The y-intercept b therefore controls where the line sits, independently of its slope.
y=ax+b
Bias Importance
Input Output
Same Concept Applies to Bias
S= 𝟏
𝒎
𝑿𝒊 𝑾𝒊+BIAS
BA
01
1
10
00
0
11
F(s)s
𝑿 𝟏
𝑿 𝟐
bin
𝒀𝒋
+1
𝒃 𝟏
+1
𝒃 𝟐
1/0
+1
𝒃 𝟑
𝑾 𝟓
𝑾 𝟔
A
B
𝑾 𝟏
𝑾 𝟐
𝑾 𝟑
𝑾 𝟒
Bias Importance
Input Output
S= 𝟏
𝒎
𝑿𝒊 𝑾𝒊+BIAS
BA
01
1
10
00
0
11
F(s)s
𝑿 𝟏
𝑿 𝟐
bin
𝒀𝒋
+1
𝒃 𝟏
+1
𝒃 𝟐
1/0
+1
𝒃 𝟑
𝑾 𝟓
𝑾 𝟔
A
B
𝑾 𝟏
𝑾 𝟐
𝑾 𝟑
𝑾 𝟒
Bias Importance
Input Output
S= 𝟏
𝒎
𝑿𝒊 𝑾𝒊+BIAS
BA
01
1
10
00
0
11
F(s)s
𝑿 𝟏
𝑿 𝟐
bin
𝒀𝒋
+1
𝒃 𝟏
+1
𝒃 𝟐
1/0
+1
𝒃 𝟑
𝑾 𝟓
𝑾 𝟔
A
B
𝑾 𝟏
𝑾 𝟐
𝑾 𝟑
𝑾 𝟒
Bias Importance
Input Output
S= 𝟏
𝒎
𝑿𝒊 𝑾𝒊+BIAS
BA
01
1
10
00
0
11
F(s)s
𝑿 𝟏
𝑿 𝟐
bin
𝒀𝒋
+1
𝒃 𝟏
+1
𝒃 𝟐
1/0
+1
𝒃 𝟑
𝑾 𝟓
𝑾 𝟔
A
B
𝑾 𝟏
𝑾 𝟐
𝑾 𝟑
𝑾 𝟒
Bias Importance
Input Output
S= 𝟏
𝒎
𝑿𝒊 𝑾𝒊+BIAS
BA
01
1
10
00
0
11
F(s)s
𝑿 𝟏
𝑿 𝟐
bin
𝒀𝒋
+1
𝒃 𝟏
+1
𝒃 𝟐
1/0
+1
𝒃 𝟑
𝑾 𝟓
𝑾 𝟔
A
B
𝑾 𝟏
𝑾 𝟐
𝑾 𝟑
𝑾 𝟒
Learning Rate
0 ≤ η ≤ 1
Summary of Parameters
Inputs:              X(n) = (X0, X1, X2, …)
Weights:             W(n) = (W0, W1, W2, …)
Bias:                b1, b2, b3
Sum of Products:     s = X0*W0 + X1*W1 + X2*W2 + …
Activation Function: bin
Outputs:             Yj
Learning Rate:       0 ≤ η ≤ 1

Other Parameters
Step:           n = 0, 1, 2, …
Desired Output: d(n) = 1 if x(n) belongs to class C1 (output 1); d(n) = 0 if x(n) belongs to class C2 (output 0)
Neural Networks Training Steps
1. Weights Initialization
2. Inputs Application
3. Sum of Inputs-Weights Products
4. Activation Function Response Calculation
5. Weights Adaptation
6. Back to Step 2
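The six steps above can be sketched as a training loop for a single output neuron (a sketch, not the deck's exact procedure: the stopping criterion and the AND-gate sample data are assumptions for illustration; XOR itself needs the hidden layer described earlier):

```python
def bin_step(s):
    # binary step activation: 1 if s >= 0, else 0
    return 1 if s >= 0 else 0

def train(samples, w, eta):
    # samples: list of (x_vector, desired) pairs; x includes the +1 bias input
    # Step 1: weights w are assumed already initialized by the caller
    adapted = True
    while adapted:                     # Step 6: back to Step 2 until stable
        adapted = False
        for x, d in samples:           # Step 2: apply inputs
            s = sum(xi * wi for xi, wi in zip(x, w))  # Step 3: SOP
            y = bin_step(s)            # Step 4: activation response
            if y != d:                 # Step 5: adapt weights if wrong
                w = [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]
                adapted = True
    return w

# Illustrative run on the linearly separable AND gate:
w = train([([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)],
          [0, 0, 0], 0.5)
```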
Regarding the 5th Step: Weights Adaptation
• If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation:
W(n+1) = W(n) + η[d(n) − Y(n)]X(n)
where
W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
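The adaptation equation can be written directly (a sketch; the weight vector includes the bias as its first element, matching W(n) = [b(n), W1(n), …]):

```python
def adapt_weights(w, x, d, y, eta):
    # W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n), applied element-wise
    return [wi + eta * (d - y) * xi for wi, xi in zip(w, x)]

# If the prediction already matches the desired output, weights are unchanged:
print(adapt_weights([-1.5, 1, 1], [1, 1, 0], 1, 1, 0.001))  # [-1.5, 1.0, 1.0]
```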
Neural Networks Training Example
Step n=0
• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:
η = .001
X(n) = X(0) = (+1, +1, +1, 1, 0)
W(n) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(0) = 1

SOP S1 = (+1)b1 + X1W1 + X2W3 = 1*-1.5 + 1*1 + 0*1 = -.5
Output Y(S1) = BIN(S1) = BIN(-.5) = 0, where bin(s) = +1 if s ≥ 0 and 0 if s < 0

SOP S2 = (+1)b2 + X1W2 + X2W4 = 1*-.5 + 1*1 + 0*1 = .5
Output Y(S2) = BIN(.5) = 1

SOP S3 = (+1)b3 + Y(S1)W5 + Y(S2)W6 = 1*-.5 + 0*-2 + 1*1 = .5
Output Y(S3) = BIN(.5) = 1

Step output: Y(n) = Y(0) = Y(S3) = 1
Predicted vs. Desired: Y(n) = 1 and d(n) = d(0) = 1. Since Y(n) = d(n), the weights are correct; no adaptation.
Neural Networks Training Example
Step n=1
• Parameters of step n=1:
η = .001
X(n) = X(1) = (+1, +1, +1, 0, 1)
W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(1) = 1

SOP S1 = (+1)b1 + X1W1 + X2W3 = 1*-1.5 + 0*1 + 1*1 = -.5
Output Y(S1) = BIN(-.5) = 0

SOP S2 = (+1)b2 + X1W2 + X2W4 = 1*-.5 + 0*1 + 1*1 = .5
Output Y(S2) = BIN(.5) = 1

SOP S3 = (+1)b3 + Y(S1)W5 + Y(S2)W6 = 1*-.5 + 0*-2 + 1*1 = .5
Output Y(S3) = BIN(.5) = 1

Step output: Y(n) = Y(1) = Y(S3) = 1
Predicted vs. Desired: Y(n) = 1 and d(n) = d(1) = 1. Since Y(n) = d(n), the weights are correct; no adaptation.
Neural Networks Training Example
Step n=2
• Parameters of step n=2:
η = .001
X(n) = X(2) = (+1, +1, +1, 0, 0)
W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(2) = 0

SOP S1 = (+1)b1 + X1W1 + X2W3 = 1*-1.5 + 0*1 + 0*1 = -1.5
Output Y(S1) = BIN(-1.5) = 0

SOP S2 = (+1)b2 + X1W2 + X2W4 = 1*-.5 + 0*1 + 0*1 = -.5
Output Y(S2) = BIN(-.5) = 0

SOP S3 = (+1)b3 + Y(S1)W5 + Y(S2)W6 = 1*-.5 + 0*-2 + 0*1 = -.5
Output Y(S3) = BIN(-.5) = 0

Step output: Y(n) = Y(2) = Y(S3) = 0
Predicted vs. Desired: Y(n) = 0 and d(n) = d(2) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
Neural Networks Training Example
Step n=3
• Parameters of step n=3:
η = .001
X(n) = X(3) = (+1, +1, +1, 1, 1)
W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (-1.5, -.5, -.5, 1, 1, 1, 1, -2, 1)
d(n) = d(3) = 0

SOP S1 = (+1)b1 + X1W1 + X2W3 = 1*-1.5 + 1*1 + 1*1 = .5
Output Y(S1) = BIN(.5) = 1

SOP S2 = (+1)b2 + X1W2 + X2W4 = 1*-.5 + 1*1 + 1*1 = 1.5
Output Y(S2) = BIN(1.5) = 1

SOP S3 = (+1)b3 + Y(S1)W5 + Y(S2)W6 = 1*-.5 + 1*-2 + 1*1 = -1.5
Output Y(S3) = BIN(-1.5) = 0

Step output: Y(n) = Y(3) = Y(S3) = 0
Predicted vs. Desired: Y(n) = 0 and d(n) = d(3) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
Final Weights
b1 = -1.5, b2 = -.5, b3 = -.5, W1 = W2 = W3 = W4 = +1, W5 = -2, W6 = +1
The current weights predicted the desired outputs for all four XOR inputs, so no adaptation was ever needed.
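As a check, the whole network with these weights can be run over all four XOR inputs (a sketch; function names are illustrative, and the values are the deck's: b1 = -1.5, b2 = b3 = -.5, W1..W4 = 1, W5 = -2, W6 = 1):

```python
def bin_step(s):
    # bin(s): 1 if s >= 0, 0 otherwise
    return 1 if s >= 0 else 0

def predict_xor(x1, x2):
    b1, b2, b3 = -1.5, -0.5, -0.5
    w1 = w2 = w3 = w4 = 1.0
    w5, w6 = -2.0, 1.0
    s1 = bin_step(b1 + x1 * w1 + x2 * w3)    # hidden neuron 1
    s2 = bin_step(b2 + x1 * w2 + x2 * w4)    # hidden neuron 2
    return bin_step(b3 + s1 * w5 + s2 * w6)  # output neuron

for a, b in [(0, 1), (1, 0), (0, 0), (1, 1)]:
    print(a, b, "->", predict_xor(a, b))  # 1, 1, 0, 0
```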

More Related Content

What's hot

Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural networkSopheaktra YONG
 
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Simplilearn
 
Artificial Neural Networks Lect5: Multi-Layer Perceptron & Backpropagation
Artificial Neural Networks Lect5: Multi-Layer Perceptron & BackpropagationArtificial Neural Networks Lect5: Multi-Layer Perceptron & Backpropagation
Artificial Neural Networks Lect5: Multi-Layer Perceptron & BackpropagationMohammed Bennamoun
 
GAN - Theory and Applications
GAN - Theory and ApplicationsGAN - Theory and Applications
GAN - Theory and ApplicationsEmanuele Ghelfi
 
Perceptron 2015.ppt
Perceptron 2015.pptPerceptron 2015.ppt
Perceptron 2015.pptSadafAyesha9
 
Uncertain Knowledge and Reasoning in Artificial Intelligence
Uncertain Knowledge and Reasoning in Artificial IntelligenceUncertain Knowledge and Reasoning in Artificial Intelligence
Uncertain Knowledge and Reasoning in Artificial IntelligenceExperfy
 
Actor critic algorithm
Actor critic algorithmActor critic algorithm
Actor critic algorithmJie-Han Chen
 
Activation function
Activation functionActivation function
Activation functionAstha Jain
 
Stuart russell and peter norvig artificial intelligence - a modern approach...
Stuart russell and peter norvig   artificial intelligence - a modern approach...Stuart russell and peter norvig   artificial intelligence - a modern approach...
Stuart russell and peter norvig artificial intelligence - a modern approach...Lê Anh Đạt
 
Activation functions
Activation functionsActivation functions
Activation functionsPRATEEK SAHU
 
Artificial intelligence- Logic Agents
Artificial intelligence- Logic AgentsArtificial intelligence- Logic Agents
Artificial intelligence- Logic AgentsNuruzzaman Milon
 
Machine learning with neural networks
Machine learning with neural networksMachine learning with neural networks
Machine learning with neural networksLet's talk about IT
 
Introduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkIntroduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkKnoldus Inc.
 
Matlab Image Restoration Techniques
Matlab Image Restoration TechniquesMatlab Image Restoration Techniques
Matlab Image Restoration TechniquesDataminingTools Inc
 
Game Playing in Artificial Intelligence
Game Playing in Artificial IntelligenceGame Playing in Artificial Intelligence
Game Playing in Artificial Intelligencelordmwesh
 
Regularization in deep learning
Regularization in deep learningRegularization in deep learning
Regularization in deep learningKien Le
 

What's hot (20)

A* Algorithm
A* AlgorithmA* Algorithm
A* Algorithm
 
Feedforward neural network
Feedforward neural networkFeedforward neural network
Feedforward neural network
 
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
Artificial Neural Network | Deep Neural Network Explained | Artificial Neural...
 
Artificial Neural Networks Lect5: Multi-Layer Perceptron & Backpropagation
Artificial Neural Networks Lect5: Multi-Layer Perceptron & BackpropagationArtificial Neural Networks Lect5: Multi-Layer Perceptron & Backpropagation
Artificial Neural Networks Lect5: Multi-Layer Perceptron & Backpropagation
 
GAN - Theory and Applications
GAN - Theory and ApplicationsGAN - Theory and Applications
GAN - Theory and Applications
 
Perceptron 2015.ppt
Perceptron 2015.pptPerceptron 2015.ppt
Perceptron 2015.ppt
 
Uncertain Knowledge and Reasoning in Artificial Intelligence
Uncertain Knowledge and Reasoning in Artificial IntelligenceUncertain Knowledge and Reasoning in Artificial Intelligence
Uncertain Knowledge and Reasoning in Artificial Intelligence
 
Actor critic algorithm
Actor critic algorithmActor critic algorithm
Actor critic algorithm
 
Activation function
Activation functionActivation function
Activation function
 
Stuart russell and peter norvig artificial intelligence - a modern approach...
Stuart russell and peter norvig   artificial intelligence - a modern approach...Stuart russell and peter norvig   artificial intelligence - a modern approach...
Stuart russell and peter norvig artificial intelligence - a modern approach...
 
Activation functions
Activation functionsActivation functions
Activation functions
 
Artificial intelligence- Logic Agents
Artificial intelligence- Logic AgentsArtificial intelligence- Logic Agents
Artificial intelligence- Logic Agents
 
Machine learning with neural networks
Machine learning with neural networksMachine learning with neural networks
Machine learning with neural networks
 
Neural Networks: Introducton
Neural Networks: IntroductonNeural Networks: Introducton
Neural Networks: Introducton
 
Introduction to Recurrent Neural Network
Introduction to Recurrent Neural NetworkIntroduction to Recurrent Neural Network
Introduction to Recurrent Neural Network
 
Matlab Image Restoration Techniques
Matlab Image Restoration TechniquesMatlab Image Restoration Techniques
Matlab Image Restoration Techniques
 
Game Playing in Artificial Intelligence
Game Playing in Artificial IntelligenceGame Playing in Artificial Intelligence
Game Playing in Artificial Intelligence
 
Regularization in deep learning
Regularization in deep learningRegularization in deep learning
Regularization in deep learning
 
Gaussian noise
Gaussian noiseGaussian noise
Gaussian noise
 
Forward checking
Forward checkingForward checking
Forward checking
 

Similar to Artificial Neural Networks (ANNs) - XOR - Step-By-Step

Digital Electronics Fundamentals
Digital Electronics Fundamentals Digital Electronics Fundamentals
Digital Electronics Fundamentals Darwin Nesakumar
 
Neural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmNeural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmMartin Opdam
 
Combinational logic 2
Combinational logic 2Combinational logic 2
Combinational logic 2Heman Pathak
 
Digital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxDigital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxssuser6feece1
 
Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5JalpaMaheshwari1
 
Logic gates and logic circuits
Logic gates and logic circuitsLogic gates and logic circuits
Logic gates and logic circuitsjyoti_lakhani
 
Deep learning simplified
Deep learning simplifiedDeep learning simplified
Deep learning simplifiedLovelyn Rose
 
OCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth TablesOCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth Tablesnorthernkiwi
 
Dcs lec03 - z-analysis of discrete time control systems
Dcs   lec03 - z-analysis of discrete time control systemsDcs   lec03 - z-analysis of discrete time control systems
Dcs lec03 - z-analysis of discrete time control systemsAmr E. Mohamed
 
Mba admission in india
Mba admission in indiaMba admission in india
Mba admission in indiaEdhole.com
 
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Hsien-Hsin Sean Lee, Ph.D.
 
Computer archi&mp
Computer archi&mpComputer archi&mp
Computer archi&mpMSc CST
 
Combinational circuit (7-Segment display)
Combinational circuit (7-Segment display)Combinational circuit (7-Segment display)
Combinational circuit (7-Segment display)ali9753
 

Similar to Artificial Neural Networks (ANNs) - XOR - Step-By-Step (20)

Digital Electronics Fundamentals
Digital Electronics Fundamentals Digital Electronics Fundamentals
Digital Electronics Fundamentals
 
Neural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmNeural Network Back Propagation Algorithm
Neural Network Back Propagation Algorithm
 
Fourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasersFourier analysis presentation for thunder chasers
Fourier analysis presentation for thunder chasers
 
Combinational logic 2
Combinational logic 2Combinational logic 2
Combinational logic 2
 
Digital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptxDigital-Logic40124sequential circuits logic gatepptx
Digital-Logic40124sequential circuits logic gatepptx
 
CH11-Digital Logic.pptx
CH11-Digital Logic.pptxCH11-Digital Logic.pptx
CH11-Digital Logic.pptx
 
Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5Square of an Input Number - Digital Logic Design | Lecture 5
Square of an Input Number - Digital Logic Design | Lecture 5
 
Mod 3.pptx
Mod 3.pptxMod 3.pptx
Mod 3.pptx
 
Logic circuit2017
Logic circuit2017Logic circuit2017
Logic circuit2017
 
Feedback amplifier
Feedback amplifierFeedback amplifier
Feedback amplifier
 
Logic gates and logic circuits
Logic gates and logic circuitsLogic gates and logic circuits
Logic gates and logic circuits
 
Deep learning simplified
Deep learning simplifiedDeep learning simplified
Deep learning simplified
 
Logic Equation Simplification
Logic Equation SimplificationLogic Equation Simplification
Logic Equation Simplification
 
OCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth TablesOCR GCSE Computing - Binary logic and Truth Tables
OCR GCSE Computing - Binary logic and Truth Tables
 
Logic gates presentation
Logic gates presentationLogic gates presentation
Logic gates presentation
 
Dcs lec03 - z-analysis of discrete time control systems
Dcs   lec03 - z-analysis of discrete time control systemsDcs   lec03 - z-analysis of discrete time control systems
Dcs lec03 - z-analysis of discrete time control systems
 
Mba admission in india
Mba admission in indiaMba admission in india
Mba admission in india
 
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
Lec16 Intro to Computer Engineering by Hsien-Hsin Sean Lee Georgia Tech -- Fi...
 
Computer archi&mp
Computer archi&mpComputer archi&mp
Computer archi&mp
 
Combinational circuit (7-Segment display)
Combinational circuit (7-Segment display)Combinational circuit (7-Segment display)
Combinational circuit (7-Segment display)
 

Artificial Neural Networks (ANNs) - XOR - Step-By-Step

  • 1. Artificial Neural Networks (ANNs) XOR Step-By-Step MENOUFIA UNIVERSITY FACULTY OF COMPUTERS AND INFORMATION ALL DEPARTMENTS ARTIFICIAL INTELLIGENCE ‫المنوفية‬ ‫جامعة‬ ‫والمعلومات‬ ‫الحاسبات‬ ‫كلية‬ ‫األقسام‬ ‫جميع‬ ‫الذكاء‬‫اإلصطناعي‬ ‫المنوفية‬ ‫جامعة‬ Ahmed Fawzy Gad ahmed.fawzy@ci.menofia.edu.eg
  • 3. Neural Networks Input Hidden Output BA 01 1 10 00 0 11
  • 5. Neural Networks XOR truth table (A, B → output): 00→0, 01→1, 10→1, 11→0. Can't be solved linearly. A single-layer perceptron can't work. Use a hidden layer.
  • 8. Hidden Layer Start by Two Neurons Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden A B
  • 9. Output Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden 1/0 𝒀𝒋 A B
  • 10. Weights Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 A B 1/0 𝒀𝒋
  • 11. Weights Input Layer – Hidden Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 A B 1/0 𝒀𝒋
  • 12. Weights Hidden Layer – Output Layer Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟓 𝑾 𝟔 A B 1/0 𝒀𝒋 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 13. All Layers Input Output BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Hidden Weights=𝑾𝒊 𝑾 𝟓 𝑾 𝟔 1/0 𝒀𝒋 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 14. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 Input Hidden 1/0 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 15. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 Input Hidden 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 16. Activation Function Output 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 17. Activation Function Components Output 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 18. Activation Function Inputs Output s 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 19. Activation Function Inputs Output s 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 20. Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 21. Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=SOP(𝑿𝒊, 𝑾𝒊)
  • 22. 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 Activation Function Inputs Output s 𝑿𝒊=Inputs 𝑾𝒊=Weights 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊 s=SOP(𝑿𝒊, 𝑾𝒊) 1/0
  • 23. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊 1/0 Each Hidden/Output Layer Neuron has its SOP.
  • 24. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊 1/0
  • 25. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊 1/0
  • 26. Activation Function Inputs Output s 𝑿 𝟏 𝑿 𝟐 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊 1/0
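The three SOPs above can be sketched in Python. The weight values below are placeholders chosen only for illustration (the worked example later fixes its own values), and `sop` is just a helper name:

```python
def sop(inputs, weights):
    """Sum of products: S = X1*W1 + X2*W2 + ... over paired inputs/weights."""
    return sum(x * w for x, w in zip(inputs, weights))

# One XOR input pattern (A=1, B=0) and illustrative weight values.
x1, x2 = 1, 0
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

s1 = sop((x1, x2), (w1, w3))  # hidden neuron 1: S1 = X1*W1 + X2*W3
s2 = sop((x1, x2), (w2, w4))  # hidden neuron 2: S2 = X1*W2 + X2*W4
s3 = sop((s1, s2), (w5, w6))  # output neuron:  S3 = S1*W5 + S2*W6
```

Each hidden/output neuron gets its own SOP, exactly as the slides state.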
  • 27. Activation Function Outputs Output F(s)s 𝑿 𝟏 𝑿 𝟐 Class Label 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 28. Activation Function Outputs Output F(s)s 𝑿 𝟏 𝑿 𝟐 Class Label 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 Input Hidden 1/0 Each Hidden/Output Layer Neuron has its Activation Function. 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 30. Activation Functions Which activation function to use? The activation function maps the outputs 𝒀𝒋 to the class labels 𝑪𝒋. There are TWO class labels, so there must be TWO possible outputs: choose an activation function that gives two outputs.
  • 32. Activation Function Output F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 BA 01 1 10 00 0 11 0 0.2 0.4 0.6 0.8 1 1.2 0 0.2 0.4 0.6 0.8 1 1.2 1/0 Input Hidden 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
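The binary (`bin`) activation shown here can be written directly: it thresholds the SOP at zero and returns one of the two class labels, 1 or 0:

```python
def bin_step(s):
    """Binary-step activation from the slides: 1 when s >= 0, otherwise 0."""
    return 1 if s >= 0 else 0
```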
  • 33. Bias Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 34. Bias Hidden Layer Neurons Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0
  • 35. Bias Output Layer Neurons Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟑 1/0 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 36. All Bias Values Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 37. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 38. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟏=(𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 1/0 +1 𝒃 𝟑 𝑺 𝟏=(+𝟏𝒃 𝟏+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 39. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟐=(𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 1/0 +1 𝒃 𝟑 𝑺 𝟐=(+𝟏𝒃 𝟐+𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 40. Bias Add Bias to SOP Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 𝑺 𝟑=(𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 1/0 +1 𝒃 𝟑 𝑺 𝟑=(+𝟏𝒃 𝟑+𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
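With the bias folded in, each SOP gains a +1·bᵢ term. A minimal sketch, using the bias and weight values that the worked example adopts later:

```python
b1, b2, b3 = -1.5, -0.5, -0.5              # biases of the three neurons
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

x1, x2 = 1, 0                              # input pattern A=1, B=0

s1 = 1 * b1 + x1 * w1 + x2 * w3            # S1 = +1*b1 + X1*W1 + X2*W3
s2 = 1 * b2 + x1 * w2 + x2 * w4            # S2 = +1*b2 + X1*W2 + X2*W4
# S3 = +1*b3 + S1*W5 + S2*W6, computed from the *activated* hidden outputs
```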
  • 41. Bias Importance Input Output BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 42. Bias Importance Input Output X Y BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 43. Bias Importance Input Output X Y y=ax+b BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 44. Bias Importance Input Output X Y y=ax+b BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 45. Bias Importance Input Output X Y y=ax+b Y-Intercept BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 46. Bias Importance Input Output X Y y=ax+b Y-Intercept b=0 BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 47. Bias Importance Input Output X Y Y-Intercept b=0 BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 48. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 49. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 50. Bias Importance Input Output X Y Y-Intercept b=-v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 51. Bias Importance Input Output X Y Y-Intercept b=+v BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 y=ax+b
  • 52. Bias Importance Input Output Same Concept Applies to Bias S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 53. Bias Importance Input Output S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 54. Bias Importance Input Output S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 55. Bias Importance Input Output S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 56. Bias Importance Input Output S= 𝟏 𝒎 𝑿𝒊 𝑾𝒊+BIAS BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 57. Learning Rate 𝟎 ≤ η ≤ 𝟏 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 58. Summary of Parameters Inputs 𝑿 𝒎 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 59. Summary of Parameters Weights 𝑾 𝒎 𝟎 ≤ η ≤ 𝟏 W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 60. Summary of Parameters Bias 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 61. Summary of Parameters Sum Of Products (SOP) 𝒔 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 62. Summary of Parameters Activation Function 𝒃𝒊𝒏 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …) F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒
  • 63. Summary of Parameters Outputs 𝒀𝒋 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 64. Summary of Parameters Learning Rate η 𝟎 ≤ η ≤ 𝟏 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 65. Other Parameters Step n 𝒏 = 𝟎, 𝟏, 𝟐, … F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 66. Other Parameters Desired Output 𝒅𝒋 𝒏 = 𝟎, 𝟏, 𝟐, … 𝒅 𝒏 = 𝟏, 𝒙 𝒏 𝒃𝒆𝒍𝒐𝒏𝒈𝒔 𝒕𝒐 𝑪𝟏 (𝟏) 𝟎, 𝒙 𝒏 𝒃𝒆𝒍𝒐𝒏𝒈𝒔 𝒕𝒐 𝑪𝟐 (𝟎) BA 01 1 10 00 0 11 F(s)s 𝑿 𝟏 𝑿 𝟐 bin 𝒀𝒋 +1 𝒃 𝟏 +1 𝒃 𝟐 1/0 +1 𝒃 𝟑 𝑾 𝟓 𝑾 𝟔 A B 𝑾 𝟏 𝑾 𝟐 𝑾 𝟑 𝑾 𝟒 s=(𝑿 𝟎 𝑾 𝟎+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟐+…) 𝟎 ≤ η ≤ 𝟏 𝑿(𝒏)=(𝑿 𝟎, 𝑿 𝟏,𝑿 𝟐, …) W(𝒏)=(𝑾 𝟎, 𝑾 𝟏,𝑾 𝟐, …)
  • 67. Neural Networks Training Steps Weights Initialization Inputs Application Sum of Inputs-Weights Products Activation Function Response Calculation Weights Adaptation Back to Step 2 1 2 3 4 5 6
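The six training steps above can be sketched for a single neuron (the slides apply the same loop to the whole network; the function name and the sample format below are illustrative assumptions):

```python
def bin_step(s):
    """Binary-step activation: 1 when s >= 0, otherwise 0."""
    return 1 if s >= 0 else 0

def train_perceptron(samples, eta=0.001, epochs=10):
    """Single-neuron sketch of the six steps.
    samples: list of (inputs, desired); inputs include the fixed +1 bias entry."""
    n_w = len(samples[0][0])
    weights = [0.0] * n_w                       # 1. weights initialization
    for _ in range(epochs):                     # 6. back to step 2
        for x, d in samples:                    # 2. inputs application
            s = sum(xi * wi for xi, wi in zip(x, weights))  # 3. SOP
            y = bin_step(s)                     # 4. activation / response
            if y != d:                          # 5. weights adaptation
                weights = [wi + eta * (d - y) * xi
                           for wi, xi in zip(weights, x)]
    return weights
```

A single neuron trained this way can learn a linearly separable function such as AND, but, as the earlier slides note, not XOR; that is why this network needs a hidden layer.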
  • 68. Regarding 5th Step: Weights Adaptation • If the predicted output Y is not the same as the desired output d, then weights are to be adapted according to the following equation: 𝑾 𝒏 + 𝟏 = 𝑾 𝒏 + η 𝒅 𝒏 − 𝒀 𝒏 𝑿(𝒏) Where 𝑾 𝒏 = [𝒃 𝒏 , 𝑾 𝟏(𝒏), 𝑾 𝟐(𝒏), 𝑾 𝟑(𝒏), … , 𝑾 𝒎(𝒏)]
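A single application of this adaptation rule looks like the following (the function name is illustrative):

```python
def adapt(weights, x, d, y, eta):
    """W(n+1) = W(n) + eta * (d(n) - Y(n)) * X(n), element-wise."""
    return [w + eta * (d - y) * xi for w, xi in zip(weights, x)]
```

When the prediction matches the desired output, d − Y = 0 and the weights are returned unchanged.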
  • 69. Neural Networks Training Example Step n=0 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=0: η = .001 𝑋 𝑛 = 𝑋 0 = +1, +1, +1,1, 0 𝑊 𝑛 = 𝑊 0 = 𝑏1, 𝑏2, 𝑏3, 𝑤1, 𝑤2, 𝑤3, 𝑤4, 𝑤5, 𝑤6 = −1.5, −.5, −.5, 1, 1, 1, 1, −2, 1 𝑑 𝑛 = 𝑑 0 = 1 BA 01 1 => 1 10 00 0 => 0 11
  • 70. Neural Networks Training Example Step n=0 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin BA 01 1 => 1 10 00 0 => 0 11
  • 71. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟏 𝑺 𝟏=(+𝟏𝒃 𝟏+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) =+1*-1.5+1*1+0*1 =-.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 72. Neural Networks Training Example Step n=0 – Output – S1: Y(S1) = BIN(S1) = BIN(−.5) = 0, where bin(s) = 1 if s ≥ 0, else 0.
  • 73. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟐 𝑺 𝟐=(+𝟏𝒃 𝟐+𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) =+1*-.5+1*1+0*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 74. Neural Networks Training Example Step n=0 – Output – S2: Y(S2) = BIN(S2) = BIN(.5) = 1, where bin(s) = 1 if s ≥ 0, else 0.
  • 75. Neural Networks Training Example Step n=0 – SOP – 𝑺 𝟑 𝑺 𝟑=(+𝟏𝒃 𝟑+𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) =+1*-.5+0*-2+1*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 76. Neural Networks Training Example Step n=0 – Output – S3: Y(S3) = BIN(S3) = BIN(.5) = 1, where bin(s) = 1 if s ≥ 0, else 0.
  • 77. Neural Networks Training Example Step n=0 - Output 𝒀 𝒏 = 𝒀 𝟎 = 𝒀 𝑺3 = 1 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 78. Neural Networks Training Example Step n=0 Predicted Vs. Desired 𝒀 𝒏 = 𝒀 𝟎 = 1 𝐝 𝒏 = 𝒅 𝟎 = 1 ∵ 𝒀 𝒏 = 𝒅 𝒏 ∴ Weights are Correct. No Adaptation BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
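The whole of step n=0 can be checked numerically; this sketch reuses the weight values given on slide 69:

```python
def bin_step(s):
    return 1 if s >= 0 else 0

# Weights from the example: biases b1, b2, b3 and w1..w6.
b1, b2, b3 = -1.5, -0.5, -0.5
w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1

x1, x2, d = 1, 0, 1                      # step n=0 input (A=1, B=0), desired 1

s1 = b1 + x1 * w1 + x2 * w3              # -0.5 -> bin = 0
s2 = b2 + x1 * w2 + x2 * w4              #  0.5 -> bin = 1
y1, y2 = bin_step(s1), bin_step(s2)
s3 = b3 + y1 * w5 + y2 * w6              #  0.5 -> bin = 1
y = bin_step(s3)
# y == d, so the weights are correct and no adaptation is needed.
```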
  • 79. Neural Networks Training Example Step n=1 • In each step in the solution, the parameters of the neural network must be known. • Parameters of step n=1: η = .001, X(n) = X(1) = (+1, +1, +1, 0, 1), W(n) = W(1) = W(0) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1), d(n) = d(1) = 1
  • 80. Neural Networks Training Example Step n=1 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin BA 01 1 => 1 10 00 0 => 0 11
  • 81. Neural Networks Training Example Step n=1 – SOP – 𝑺 𝟏 𝑺 𝟏=(+𝟏𝒃 𝟏+𝑿 𝟏 𝑾 𝟏+𝑿 𝟐 𝑾 𝟑) =+1*-1.5+0*1+1*1 =-.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 82. Neural Networks Training Example Step n=1 – Output – S1: Y(S1) = BIN(S1) = BIN(−.5) = 0, where bin(s) = 1 if s ≥ 0, else 0.
  • 83. Neural Networks Training Example Step n=1 – SOP – 𝑺 𝟐 𝑺 𝟐=(+𝟏𝒃 𝟐+𝑿 𝟏 𝑾 𝟐+𝑿 𝟐 𝑾 𝟒) =+1*-.5+0*1+1*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 84. Neural Networks Training Example Step n=1 – Output – S2: Y(S2) = BIN(S2) = BIN(.5) = 1, where bin(s) = 1 if s ≥ 0, else 0.
  • 85. Neural Networks Training Example Step n=1 – SOP – 𝑺 𝟑 𝑺 𝟑=(+𝟏𝒃 𝟑+𝑺 𝟏 𝑾 𝟓+𝑺 𝟐 𝑾 𝟔) =+1*-.5+0*-2+1*1 =.5 BA 01 1 => 1 10 00 0 => 0 11 s 𝑿 𝟏 𝑿 𝟐 𝒀𝒋 +1 −𝟏. 𝟓 +1 −. 𝟓 1/0 +1 −. 𝟓 −2 +𝟏 A B +𝟏 +𝟏 +𝟏 +𝟏 bin
  • 86. Neural Networks Training Example Step n=1 – Output – S3: Y(S3) = bin(S3) = bin(.5) = 1, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 87. Neural Networks Training Example Step n=1 – Output: Y(n) = Y(1) = Y(S3) = 1.
  • 88. Neural Networks Training Example Step n=1 – Predicted vs. Desired: Y(n) = Y(1) = 1 and d(n) = d(1) = 1. Since Y(n) = d(n), the weights are correct; no adaptation.
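The step n=1 computation above (A=0, B=1) can be reproduced with a short forward-pass sketch using the fixed weights from the network diagram (function and variable names here are mine, not from the slides):

```python
def bin_step(s):
    """Binary step: +1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

def forward(A, B):
    """One forward pass through the 2-2-1 XOR network."""
    b1, b2, b3 = -1.5, -0.5, -0.5                # biases
    w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1   # fixed weights
    s1 = b1 + A * w1 + B * w3                    # hidden neuron 1 SOP
    s2 = b2 + A * w2 + B * w4                    # hidden neuron 2 SOP
    y1, y2 = bin_step(s1), bin_step(s2)          # hidden activations
    s3 = b3 + y1 * w5 + y2 * w6                  # output neuron SOP
    return bin_step(s3)

print(forward(0, 1))  # step n=1: S1=-0.5, S2=0.5, S3=0.5 -> Y=1
```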
  • 89. Neural Networks Training Example Step n=2. In each step of the solution, the parameters of the neural network must be known. Parameters of step n=2: η = .001, X(n) = X(2) = (+1, +1, +1, 0, 0), W(n) = W(2) = W(1) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1), d(n) = d(2) = 0.
  • 90. Neural Networks Training Example Step n=2. [Network diagram repeated]
  • 91. Neural Networks Training Example Step n=2 – SOP – S1: S1 = (+1·b1 + X1·W1 + X2·W3) = +1*−1.5 + 0*1 + 0*1 = −1.5.
  • 92. Neural Networks Training Example Step n=2 – Output – S1: Y(S1) = bin(S1) = bin(−1.5) = 0, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 93. Neural Networks Training Example Step n=2 – SOP – S2: S2 = (+1·b2 + X1·W2 + X2·W4) = +1*−.5 + 0*1 + 0*1 = −.5.
  • 94. Neural Networks Training Example Step n=2 – Output – S2: Y(S2) = bin(S2) = bin(−.5) = 0, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 95. Neural Networks Training Example Step n=2 – SOP – S3: S3 = (+1·b3 + Y(S1)·W5 + Y(S2)·W6) = +1*−.5 + 0*−2 + 0*1 = −.5.
  • 96. Neural Networks Training Example Step n=2 – Output – S3: Y(S3) = bin(S3) = bin(−.5) = 0, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 97. Neural Networks Training Example Step n=2 – Output: Y(n) = Y(2) = Y(S3) = 0.
  • 98. Neural Networks Training Example Step n=2 – Predicted vs. Desired: Y(n) = Y(2) = 0 and d(n) = d(2) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 99. Neural Networks Training Example Step n=3. In each step of the solution, the parameters of the neural network must be known. Parameters of step n=3: η = .001, X(n) = X(3) = (+1, +1, +1, 1, 1), W(n) = W(3) = W(2) = (b1, b2, b3, w1, w2, w3, w4, w5, w6) = (−1.5, −.5, −.5, 1, 1, 1, 1, −2, 1), d(n) = d(3) = 0.
  • 100. Neural Networks Training Example Step n=3. [Network diagram repeated]
  • 101. Neural Networks Training Example Step n=3 – SOP – S1: S1 = (+1·b1 + X1·W1 + X2·W3) = +1*−1.5 + 1*1 + 1*1 = .5.
  • 102. Neural Networks Training Example Step n=3 – Output – S1: Y(S1) = bin(S1) = bin(.5) = 1, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 103. Neural Networks Training Example Step n=3 – SOP – S2: S2 = (+1·b2 + X1·W2 + X2·W4) = +1*−.5 + 1*1 + 1*1 = 1.5.
  • 104. Neural Networks Training Example Step n=3 – Output – S2: Y(S2) = bin(S2) = bin(1.5) = 1, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 105. Neural Networks Training Example Step n=3 – SOP – S3: S3 = (+1·b3 + Y(S1)·W5 + Y(S2)·W6) = +1*−.5 + 1*−2 + 1*1 = −1.5.
  • 106. Neural Networks Training Example Step n=3 – Output – S3: Y(S3) = bin(S3) = bin(−1.5) = 0, where bin(s) = +1 if s ≥ 0, 0 if s < 0.
  • 107. Neural Networks Training Example Step n=3 – Output: Y(n) = Y(3) = Y(S3) = 0.
  • 108. Neural Networks Training Example Step n=3 – Predicted vs. Desired: Y(n) = Y(3) = 0 and d(n) = d(3) = 0. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 109. Final Weights. [Network diagram: biases b1 = −1.5, b2 = −.5, b3 = −.5; weights w1 = w2 = w3 = w4 = +1, w5 = −2, w6 = +1; bin activation] The current weights predicted the desired outputs for all training samples.
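As a check on the final weights, the same forward pass (sketched here; function names are mine, not from the deck) reproduces the full XOR truth table:

```python
def bin_step(s):
    """Binary step: +1 if s >= 0, else 0."""
    return 1 if s >= 0 else 0

def forward(A, B):
    """Forward pass through the 2-2-1 network with the final weights."""
    b1, b2, b3 = -1.5, -0.5, -0.5
    w1, w2, w3, w4, w5, w6 = 1, 1, 1, 1, -2, 1
    y1 = bin_step(b1 + A * w1 + B * w3)   # hidden neuron 1
    y2 = bin_step(b2 + A * w2 + B * w4)   # hidden neuron 2
    return bin_step(b3 + y1 * w5 + y2 * w6)

for A, B in [(0, 1), (1, 0), (0, 0), (1, 1)]:
    print(A, B, '->', forward(A, B))  # 1, 1, 0, 0 respectively
```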