Artificial Neural Networks (ANNs)
Step-By-Step Training & Testing Example
MENOUFIA UNIVERSITY
FACULTY OF COMPUTERS AND INFORMATION
ALL DEPARTMENTS
ARTIFICIAL INTELLIGENCE
Ahmed Fawzy Gad
ahmed.fawzy@ci.menofia.edu.eg
Neural Networks & Classification
• Linear classifiers separate simple data with a straight line.
• Complex data cannot be solved linearly.
• Nonlinear classifiers handle complex data, and they are built through training.
Classification Example

R (RED)   G (GREEN)   B (BLUE)   Class
255       0           0          RED
248       80          68         RED
0         0           255        BLUE
67        15          210        BLUE
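The four training samples above can be written down as plain Python data (a minimal sketch; the later slides encode RED as -1 and BLUE as +1, which is used here):

```python
# Training samples from the table: (R, G, B) -> class label.
# RED is encoded as -1 and BLUE as +1, as the later slides do.
samples = [
    ((255, 0, 0), -1),    # RED
    ((248, 80, 68), -1),  # RED
    ((0, 0, 255), +1),    # BLUE
    ((67, 15, 210), +1),  # BLUE
]

for (r, g, b), label in samples:
    name = "RED" if label == -1 else "BLUE"
    print(f"R={r:3d} G={g:3d} B={b:3d} -> {name}")
```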
Neural Networks
A neural network is organized into layers: input, hidden, and output.

Input Layer
The inputs are the three color channels R, G, and B.

Output Layer
A single output Yj gives the class label (RED/BLUE).

Weights
Each input is connected to the output through a weight: W1, W2, W3 (Weights = Wi).
Activation Function
The weighted sum of the inputs is passed through an activation function to produce the output Yj.
Components
Putting it together, the neuron's components are the inputs, the weights, the activation function, and the output.
Inputs
The inputs X1, X2, X3 are combined with the weights in a sum of products (SOP):

s = SOP(Xi, Wi), where Xi = inputs and Wi = weights

s = Σ(i=1..m) Xi·Wi

For the three inputs:

s = X1W1 + X2W2 + X3W3
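The sum of products can be sketched in a few lines of Python (a minimal illustration, not from the slides):

```python
def sop(inputs, weights):
    """Sum of products s = sum(Xi * Wi) over all input/weight pairs."""
    return sum(x * w for x, w in zip(inputs, weights))

# Example with three inputs and three weights:
s = sop([255, 0, 0], [-2, 1, 6.2])
print(s)  # 255*-2 + 0*1 + 0*6.2 = -510
```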
Outputs
The SOP s is passed through the activation function F(s), whose result Yj is the predicted class label.
Activation Functions
Common activation functions include piecewise linear, sigmoid, and signum.

Which activation function to use?
The number of outputs must match the number of class labels. There are TWO class labels (RED and BLUE), so we need a function that gives two outputs: the signum.
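The signum activation used here returns +1 for s ≥ 0 and -1 otherwise, matching the two class labels; a one-line sketch:

```python
def sgn(s):
    """Signum activation: two outputs (+1/-1), matching the two class labels."""
    return +1 if s >= 0 else -1

print(sgn(4.6), sgn(-511))  # +1 -1
```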
Activation Function
The signum function sgn is applied to the SOP: Yj = sgn(s).
Bias
The bias is added as an extra input X0 = +1 with its own weight W0.

Without bias: s = X1W1 + X2W2 + X3W3
With bias: s = X0W0 + X1W1 + X2W2 + X3W3

Since X0 = +1, this simplifies to:

s = W0 + X1W1 + X2W2 + X3W3
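Treating the bias as a fixed input X0 = +1 can be sketched as follows (a minimal illustration; the helper name is my own):

```python
def sop_with_bias(inputs, weights):
    """Prepend the fixed bias input X0 = +1, so s = W0 + sum(Xi*Wi)."""
    x = [1] + list(inputs)          # X = (X0=+1, X1, X2, X3)
    return sum(xi * wi for xi, wi in zip(x, weights))

# weights = (W0, W1, W2, W3)
s = sop_with_bias([255, 0, 0], [-1, -2, 1, 6.2])
print(s)  # -1 + 255*-2 + 0 + 0 = -511
```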
Bias Importance
The bias acts like the y-intercept b in the line equation y = x + b:
• b = 0: the line passes through the origin.
• b = +v: the line shifts upward.
• b = -v: the line shifts downward.
The same concept applies to the bias in a neural network: it shifts the decision boundary.

S = Σ(i=1..m) Xi·Wi + BIAS
Learning Rate
The learning rate η controls how strongly the weights are adapted at each step: 0 ≤ η ≤ 1.
Summary of Parameters
• Inputs: X(n) = (X0, X1, X2, X3)
• Weights: W(n) = (W0, W1, W2, W3)
• Bias b, folded in as X0 = +1 with weight W0
• Sum of products (SOP): s = X0W0 + X1W1 + X2W2 + X3W3
• Activation function: sgn
• Outputs: Yj
• Learning rate: 0 ≤ η ≤ 1
Other Parameters
• Step: n = 0, 1, 2, …
• Desired output dj:
d(n) = -1 if x(n) belongs to C1 (RED)
d(n) = +1 if x(n) belongs to C2 (BLUE)

The class labels are therefore encoded as RED = -1 and BLUE = +1.
Neural Networks Training Steps
1. Weights initialization
2. Inputs application
3. Sum of inputs-weights products
4. Activation function response calculation
5. Weights adaptation
6. Back to step 2
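The six steps above can be sketched as a small perceptron-style training loop over the four samples (a minimal illustration using the slides' data and update rule; variable names are my own):

```python
def sgn(s):
    return +1 if s >= 0 else -1

def train(samples, weights, eta=0.001, epochs=10):
    """Steps 1-6: apply inputs, compute SOP, activate, adapt weights, repeat."""
    w = list(weights)
    for _ in range(epochs):
        adapted = False
        for x, d in samples:
            xb = [1] + list(x)                     # bias input X0 = +1
            s = sum(xi * wi for xi, wi in zip(xb, w))
            y = sgn(s)
            if y != d:                             # step 5: adapt only on error
                w = [wi + eta * (d - y) * xi for wi, xi in zip(w, xb)]
                adapted = True
        if not adapted:                            # all samples correct: stop
            break
    return w

samples = [((255, 0, 0), -1), ((248, 80, 68), -1),
           ((0, 0, 255), +1), ((67, 15, 210), +1)]
w = train(samples, [-1, -2, 1, 6.2])
print(w)
```

With the slides' initial weights this loop performs exactly one adaptation (at the second sample) and then classifies all four samples correctly.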
Regarding 5th Step: Weights Adaptation
• If the predicted output Y is not the same as the desired output d, then the weights are adapted according to the following equation:

W(n+1) = W(n) + η[d(n) − Y(n)]X(n)

Where
W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)]
Neural Networks Training Example

Step n=0
• In each step of the solution, the parameters of the neural network must be known.
• Parameters of step n=0:
η = .001
X(n) = X(0) = (+1, 255, 0, 0)
W(n) = W(0) = (-1, -2, 1, 6.2)
d(n) = d(0) = -1

Step n=0 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1 + 255*-2 + 0*1 + 0*6.2
 = -511

Step n=0 - Output
sgn(s) = +1 if s ≥ 0, -1 if s < 0
Y(n) = Y(0) = SGN(s) = SGN(-511) = -1 → RED

Step n=0 - Predicted Vs. Desired
Y(n) = Y(0) = -1
d(n) = d(0) = -1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=1
• Parameters of step n=1:
η = .001
X(n) = X(1) = (+1, 248, 80, 68)
W(n) = W(1) = W(0) = (-1, -2, 1, 6.2)
d(n) = d(1) = -1

Step n=1 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1 + 248*-2 + 80*1 + 68*6.2
 = 4.6

Step n=1 - Output
Y(n) = Y(1) = SGN(4.6) = +1 → BLUE

Step n=1 - Predicted Vs. Desired
Y(n) = Y(1) = +1
d(n) = d(1) = -1
∵ Y(n) ≠ d(n) ∴ Weights are incorrect. Adaptation required.
Weights Adaptation
• According to
W(n+1) = W(n) + η[d(n) − Y(n)]X(n)
• Where n = 1:
W(2) = W(1) + η[d(1) − Y(1)]X(1)
W(2) = (-1, -2, 1, 6.2) + .001[-1 − (+1)](+1, 248, 80, 68)
W(2) = (-1, -2, 1, 6.2) + .001(-2)(+1, 248, 80, 68)
W(2) = (-1, -2, 1, 6.2) + (-.002)(+1, 248, 80, 68)
W(2) = (-1, -2, 1, 6.2) + (-.002, -.496, -.16, -.136)
W(2) = (-1.002, -2.496, .84, 6.064)
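The adaptation above can be checked numerically (a quick sketch, rounding to three decimals):

```python
eta = 0.001
w1 = [-1, -2, 1, 6.2]     # W(1)
x1 = [+1, 248, 80, 68]    # X(1), bias input included
d, y = -1, +1             # desired vs. predicted at n=1

# W(2) = W(1) + eta*(d - y)*X(1)
w2 = [wi + eta * (d - y) * xi for wi, xi in zip(w1, x1)]
print([round(v, 3) for v in w2])  # [-1.002, -2.496, 0.84, 6.064]
```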
Neural Networks Training Example

Step n=2
• Parameters of step n=2:
η = .001
X(n) = X(2) = (+1, 0, 0, 255)
W(n) = W(2) = (-1.002, -2.496, .84, 6.064)
d(n) = d(2) = +1

Step n=2 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1.002 + 0*-2.496 + 0*.84 + 255*6.064
 = 1545.318

Step n=2 - Output
Y(n) = Y(2) = SGN(1545.318) = +1 → BLUE

Step n=2 - Predicted Vs. Desired
Y(n) = Y(2) = +1
d(n) = d(2) = +1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=3
• Parameters of step n=3:
η = .001
X(n) = X(3) = (+1, 67, 15, 210)
W(n) = W(3) = W(2) = (-1.002, -2.496, .84, 6.064)
d(n) = d(3) = +1

Step n=3 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1.002 + 67*-2.496 + 15*.84 + 210*6.064
 = 1117.806

Step n=3 - Output
Y(n) = Y(3) = SGN(1117.806) = +1 → BLUE

Step n=3 - Predicted Vs. Desired
Y(n) = Y(3) = +1
d(n) = d(3) = +1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=4
• Parameters of step n=4:
η = .001
X(n) = X(4) = (+1, 255, 0, 0)
W(n) = W(4) = W(3) = (-1.002, -2.496, .84, 6.064)
d(n) = d(4) = -1

Step n=4 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1.002 + 255*-2.496 + 0*.84 + 0*6.064
 = -637.482

Step n=4 - Output
Y(n) = Y(4) = SGN(-637.482) = -1 → RED

Step n=4 - Predicted Vs. Desired
Y(n) = Y(4) = -1
d(n) = d(4) = -1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Neural Networks Training Example

Step n=5
• Parameters of step n=5:
η = .001
X(n) = X(5) = (+1, 248, 80, 68)
W(n) = W(5) = W(4) = (-1.002, -2.496, .84, 6.064)
d(n) = d(5) = -1

Step n=5 - SOP
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1.002 + 248*-2.496 + 80*.84 + 68*6.064
 = -140.458

Step n=5 - Output
Y(n) = Y(5) = SGN(-140.458) = -1 → RED

Step n=5 - Predicted Vs. Desired
Y(n) = Y(5) = -1
d(n) = d(5) = -1
∵ Y(n) = d(n) ∴ Weights are correct. No adaptation.
Correct Weights
• After testing the weights across all samples with correct results, we can conclude that the current weights are correct for this neural network.
• After the training phase comes testing the neural network.
• What is the class of an unknown color with values R=150, G=100, B=180?
Testing Trained Neural Network
(R, G, B) = (150, 100, 180)

Trained neural network parameters:
η = .001
W = (-1.002, -2.496, .84, 6.064)

SOP:
s = X0W0 + X1W1 + X2W2 + X3W3
 = +1*-1.002 + 150*-2.496 + 100*.84 + 180*6.064
 = 800.118

Output:
Y = SGN(800.118) = +1 → BLUE
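The final classification can be reproduced with the trained weights (a short sketch using the values above):

```python
w = [-1.002, -2.496, 0.84, 6.064]   # (W0, W1, W2, W3) after training
rgb = (150, 100, 180)               # unknown color to classify

x = [1] + list(rgb)                  # bias input X0 = +1
s = sum(xi * wi for xi, wi in zip(x, w))
label = "BLUE" if s >= 0 else "RED"  # sgn: +1 -> BLUE, -1 -> RED
print(round(s, 3), label)
```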
Python for Computer Vision - Revision 2nd EditionAhmed Gad
 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Ahmed Gad
 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesAhmed Gad
 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Ahmed Gad
 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Ahmed Gad
 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with RegularizationAhmed Gad
 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleAhmed Gad
 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisAhmed Gad
 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientAhmed Gad
 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - RevisionAhmed Gad
 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAhmed Gad
 
Brief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNsBrief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNsAhmed Gad
 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleAhmed Gad
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingAhmed Gad
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...Ahmed Gad
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Ahmed Gad
 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Ahmed Gad
 
Introduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesIntroduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesAhmed Gad
 

More from Ahmed Gad (20)

ICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic AlgorithmICEIT'20 Cython for Speeding-up Genetic Algorithm
ICEIT'20 Cython for Speeding-up Genetic Algorithm
 
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
NumPyCNNAndroid: A Library for Straightforward Implementation of Convolutiona...
 
Python for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd EditionPython for Computer Vision - Revision 2nd Edition
Python for Computer Vision - Revision 2nd Edition
 
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
Multi-Objective Optimization using Non-Dominated Sorting Genetic Algorithm wi...
 
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded ScenesM.Sc. Thesis - Automatic People Counting in Crowded Scenes
M.Sc. Thesis - Automatic People Counting in Crowded Scenes
 
Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)Introduction to Optimization with Genetic Algorithm (GA)
Introduction to Optimization with Genetic Algorithm (GA)
 
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
Derivation of Convolutional Neural Network (ConvNet) from Fully Connected Net...
 
Avoid Overfitting with Regularization
Avoid Overfitting with RegularizationAvoid Overfitting with Regularization
Avoid Overfitting with Regularization
 
Genetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step ExampleGenetic Algorithm (GA) Optimization - Step-by-Step Example
Genetic Algorithm (GA) Optimization - Step-by-Step Example
 
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression AnalysisICCES 2017 - Crowd Density Estimation Method using Regression Analysis
ICCES 2017 - Crowd Density Estimation Method using Regression Analysis
 
Computer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and GradientComputer Vision: Correlation, Convolution, and Gradient
Computer Vision: Correlation, Convolution, and Gradient
 
Python for Computer Vision - Revision
Python for Computer Vision - RevisionPython for Computer Vision - Revision
Python for Computer Vision - Revision
 
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia CourseAnime Studio Pro 10 Tutorial as Part of Multimedia Course
Anime Studio Pro 10 Tutorial as Part of Multimedia Course
 
Brief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNsBrief Introduction to Deep Learning + Solving XOR using ANNs
Brief Introduction to Deep Learning + Solving XOR using ANNs
 
Operations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by ExampleOperations in Digital Image Processing + Convolution by Example
Operations in Digital Image Processing + Convolution by Example
 
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and TrackingMATLAB Code + Description : Real-Time Object Motion Detection and Tracking
MATLAB Code + Description : Real-Time Object Motion Detection and Tracking
 
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...MATLAB Code + Description : Very Simple Automatic English Optical Character R...
MATLAB Code + Description : Very Simple Automatic English Optical Character R...
 
Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...Graduation Project - Face Login : A Robust Face Identification System for Sec...
Graduation Project - Face Login : A Robust Face Identification System for Sec...
 
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
Introduction to MATrices LABoratory (MATLAB) as Part of Digital Signal Proces...
 
Introduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course NotesIntroduction to Digital Signal Processing (DSP) - Course Notes
Introduction to Digital Signal Processing (DSP) - Course Notes
 

Recently uploaded

Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfSpandanaRallapalli
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfUjwalaBharambe
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Mark Reed
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...Nguyen Thanh Tu Collection
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptxSherlyMaeNeri
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationAadityaSharma884161
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Jisc
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfMr Bounab Samir
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Romantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxRomantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxsqpmdrvczh
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPCeline George
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 

Recently uploaded (20)

Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
ACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdfACC 2024 Chronicles. Cardiology. Exam.pdf
ACC 2024 Chronicles. Cardiology. Exam.pdf
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdfFraming an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
Framing an Appropriate Research Question 6b9b26d93da94caf993c038d9efcdedb.pdf
 
Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)Influencing policy (training slides from Fast Track Impact)
Influencing policy (training slides from Fast Track Impact)
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
HỌC TỐT TIẾNG ANH 11 THEO CHƯƠNG TRÌNH GLOBAL SUCCESS ĐÁP ÁN CHI TIẾT - CẢ NĂ...
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 
ROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint PresentationROOT CAUSE ANALYSIS PowerPoint Presentation
ROOT CAUSE ANALYSIS PowerPoint Presentation
 
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdfTataKelola dan KamSiber Kecerdasan Buatan v022.pdf
TataKelola dan KamSiber Kecerdasan Buatan v022.pdf
 
Rapple "Scholarly Communications and the Sustainable Development Goals"
Rapple "Scholarly Communications and the Sustainable Development Goals"Rapple "Scholarly Communications and the Sustainable Development Goals"
Rapple "Scholarly Communications and the Sustainable Development Goals"
 
Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...Procuring digital preservation CAN be quick and painless with our new dynamic...
Procuring digital preservation CAN be quick and painless with our new dynamic...
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdfLike-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
Like-prefer-love -hate+verb+ing & silent letters & citizenship text.pdf
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Romantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxRomantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptx
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 

Introduction to Artificial Neural Networks (ANNs) - Step-by-Step Training & Testing Example

  • 1. Artificial Neural Networks (ANNs): Step-By-Step Training & Testing Example. Menoufia University, Faculty of Computers and Information, All Departments, Artificial Intelligence. Ahmed Fawzy Gad (ahmed.fawzy@ci.menofia.edu.eg)
  • 2. Neural Networks & Classification
  • 15. Classification Example. The training data consists of four RGB color samples and their class labels:
    R (RED) | G (GREEN) | B (BLUE) | Class
    255     | 0         | 0        | RED
    248     | 80        | 68       | RED
    0       | 0         | 255      | BLUE
    67      | 15        | 210      | BLUE
  • 16. Neural Networks. A neural network is organized into layers: Input, Hidden, and Output.
  • 17. Neural Networks. The color samples above are the data to be fed into such a network.
  • 18. Input Layer. The input layer has one neuron per feature: R, G, and B.
  • 19. Output Layer. The output layer has a single neuron Y_j whose value decides the class: RED or BLUE.
  • 20. Weights. Each input is connected to the output neuron by a weight: W1, W2, W3 (collectively, Weights = W_i).
  • 21-23. Activation Function. The output neuron applies an activation function to its weighted inputs to produce Y_j.
  • 24. Activation Function Components. The activation function has two sides to examine: its inputs and its outputs.
  • 25-26. Activation Function Inputs. The input to the activation function is s, the sum of products: s = SOP(X_i, W_i).
  • 27. Here X_i are the inputs and W_i are the weights.
  • 28. The three inputs are labeled X1, X2, X3.
  • 29. In general, s = Σ (from i=1 to m) X_i W_i.
  • 30. For this network, s = (X1 W1 + X2 W2 + X3 W3).
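The sum-of-products step above can be sketched in a few lines of Python (the helper name `sop` is illustrative, not from the slides; the sample and weight values are the ones used later in the training example):

```python
# Minimal sketch of the SOP step: s = sum of X_i * W_i.
# The sample (255, 0, 0) and weights (-2, 1, 6.2) are the values
# the slides use later; the function name `sop` is illustrative.
def sop(inputs, weights):
    return sum(x * w for x, w in zip(inputs, weights))

X = [255, 0, 0]      # R, G, B of the first RED sample
W = [-2, 1, 6.2]     # W1, W2, W3
s = sop(X, W)        # 255*-2 + 0*1 + 0*6.2 = -510
```

Once the bias weight W0 = -1 is added (see the Bias slides), this sum becomes -511.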
  • 31. Activation Function Outputs. The activation function maps s through F(s) to the output Y_j, the predicted class label.
  • 33. Activation Functions. Which activation function to use? The number of the function's outputs must match the number of class labels. There are TWO class labels, so we need a function that gives TWO outputs.
  • 35. Activation Function. The signum function, sgn, is chosen: it produces exactly two outputs, +1 and -1.
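A minimal sketch of the sgn activation, using the definition the slides give later (+1 when s ≥ 0, -1 otherwise):

```python
# sgn(s) = +1 if s >= 0, else -1: the two outputs map to the
# two class labels (+1 = BLUE, -1 = RED in this example).
def sgn(s):
    return 1 if s >= 0 else -1
```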
  • 36. Bias. A bias input X0 = +1 with weight W0 is added to the neuron; without it, s = (X1 W1 + X2 W2 + X3 W3).
  • 37. With the bias, s = (X0 W0 + X1 W1 + X2 W2 + X3 W3).
  • 38-39. Since X0 = +1, s = (+1 * W0 + X1 W1 + X2 W2 + X3 W3).
  • 40. This simplifies to s = (W0 + X1 W1 + X2 W2 + X3 W3).
  • 41-43. Bias Importance. The role of the bias is the same as the intercept b in the line equation y = x + b.
  • 44-47. With b = 0 (no y-intercept), the line is forced to pass through the origin, which limits where it can be placed.
  • 48-51. A positive intercept (b = +v) shifts the line up, and a negative one (b = -v) shifts it down, so a nonzero b lets the line be positioned freely.
  • 52. Bias Importance. The same concept applies to the bias of a neuron: S = Σ (from i=1 to m) X_i W_i + BIAS.
  • 53-56. Just as b shifts the line away from the origin, the bias shifts the neuron's decision boundary, making the classifier more flexible.
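Folding the bias into the sum of products via X0 = +1, as described above, can be sketched as follows (the helper name `sop_with_bias` is illustrative; the weights (-1, -2, 1, 6.2) are the W(0) used later in the slides' training example):

```python
# s = W0 + X1*W1 + X2*W2 + X3*W3, computed by prepending X0 = +1
# to the inputs so the bias weight W0 rides along in one sum.
def sop_with_bias(inputs, weights):
    return sum(x * w for x, w in zip([1.0] + list(inputs), weights))

W = [-1, -2, 1, 6.2]                  # W0 (bias), W1, W2, W3
s = sop_with_bias([255, 0, 0], W)     # -1 + 255*-2 = -511
```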
  • 57. Learning Rate. The learning rate η is a value between zero and one: 0 ≤ η ≤ 1.
  • 58. Summary of Parameters: Inputs. X(n) = (X0, X1, X2, X3).
  • 59. Summary of Parameters: Weights. W(n) = (W0, W1, W2, W3).
  • 60. Summary of Parameters: Bias b (folded into the weights as W0, with X0 = +1).
  • 61. Summary of Parameters: Sum of Products (SOP). s = (X0 W0 + X1 W1 + X2 W2 + X3 W3).
  • 62. Summary of Parameters: Activation Function, sgn.
  • 63. Summary of Parameters: Outputs, Y_j.
  • 64. Summary of Parameters: Learning Rate η, with 0 ≤ η ≤ 1.
  • 65. Other Parameters: Step n = 0, 1, 2, …
  • 66. Other Parameters: Desired Output d_j. d(n) = -1 if x(n) belongs to C1 (RED), and d(n) = +1 if x(n) belongs to C2 (BLUE). The samples are therefore labeled: (255, 0, 0) → RED = -1, (248, 80, 68) → RED = -1, (0, 0, 255) → BLUE = +1, (67, 15, 210) → BLUE = +1.
  • 67. Neural Networks Training Steps: (1) Weights Initialization, (2) Inputs Application, (3) Sum of Inputs-Weights Products, (4) Activation Function Response Calculation, (5) Weights Adaptation, (6) Back to Step 2.
  • 68. Regarding the 5th Step: Weights Adaptation. If the predicted output Y is not the same as the desired output d, the weights are adapted according to: W(n+1) = W(n) + η[d(n) - Y(n)]X(n), where W(n) = [b(n), W1(n), W2(n), W3(n), …, Wm(n)].
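The six training steps plus the adaptation rule can be sketched as a small perceptron loop (the function names are illustrative; the data, W(0) = (-1, -2, 1, 6.2), and η = 0.001 are the values the slides use):

```python
# Perceptron training loop for the slides' color example.
def sgn(s):
    return 1 if s >= 0 else -1

def train(samples, desired, weights, eta, epochs=1):
    for _ in range(epochs):
        for x, d in zip(samples, desired):
            xb = [1.0] + list(x)                             # step 2: apply inputs, X0 = +1
            s = sum(xi * wi for xi, wi in zip(xb, weights))  # step 3: sum of products
            y = sgn(s)                                       # step 4: activation response
            if y != d:                                       # step 5: adapt only on error
                weights = [w + eta * (d - y) * xi
                           for w, xi in zip(weights, xb)]
    return weights

samples = [(255, 0, 0), (248, 80, 68), (0, 0, 255), (67, 15, 210)]
desired = [-1, -1, +1, +1]            # RED = -1, BLUE = +1
w = train(samples, desired, [-1, -2, 1, 6.2], 0.001)
```

After one pass over this data, only the second sample triggers an update, yielding the weights (-1.002, -2.496, 0.84, 6.064) that the example below derives by hand.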
  • 69. Neural Networks Training Example, Step n=0. In each step of the solution, the parameters of the neural network must be known. Parameters at step n=0: η = 0.001, X(n) = X(0) = (+1, 255, 0, 0), W(n) = W(0) = (-1, -2, 1, 6.2), d(n) = d(0) = -1.
  • 70. Neural Networks Training Example, Step n=0. The first RED sample (255, 0, 0) is applied as (X1, X2, X3) to the network with weights (W0, W1, W2, W3) = (-1, -2, 1, 6.2) and bias input X0 = +1.
  • 71. Neural Networks Training Example, Step n=0 - SOP. s = (X0 W0 + X1 W1 + X2 W2 + X3 W3) = (+1)(-1) + (255)(-2) + (0)(1) + (0)(6.2) = -511.
  • 72. Neural Networks Training Example, Step n=0 - Output. The activation function is sgn(s) = +1 if s ≥ 0, -1 if s < 0. Y(n) = Y(0) = SGN(s) = SGN(-511) = -1.
  • 73. Neural Networks Training Example, Step n=0 - Output. The predicted output is -1, i.e. RED.
  • 74. Neural Networks Training Example Step n=0 Predicted Vs. Desired 255 0 0 -2 1 6.2 𝑿 𝟏 𝑿 𝟐 𝑿 𝟑 -511 −𝟏 =+1𝑿 𝟎 -1 -1 𝒀 𝒏 = 𝒀 𝟎 = −𝟏 𝐝 𝒏 = 𝒅 𝟎 = −𝟏 ∵ 𝒀 𝒏 = 𝒅 𝒏 ∴ Weights are Correct. No Adaptation B (BLUE)G (GREEN)R (RED) 00255 RED = -1 6880248 25500 BLUE = +1 2101567
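Step n=0 can be checked in a few lines (a sketch; `sgn` follows the definition above):

```python
def sgn(s):
    # +1 if s >= 0, -1 if s < 0
    return 1 if s >= 0 else -1

x = [+1, 255, 0, 0]          # X0 (bias), R, G, B
w = [-1.0, -2.0, 1.0, 6.2]   # W0, W1, W2, W3
s = sum(xi * wi for xi, wi in zip(x, w))
print(s, sgn(s))  # -> -511.0 -1, matching d(0) = -1 (RED)
```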
  • 75. Neural Networks Training Example, Step n=1. Parameters of step n=1: η = 0.001; X(n) = X(1) = (+1, 248, 80, 68); W(n) = W(1) = W(0) = (-1, -2, 1, 6.2); d(n) = d(1) = -1.
  • 76. Step n=1, network diagram: inputs (+1, 248, 80, 68) with weights (-1, -2, 1, 6.2); the sum s passes through sgn to give Y(n).
  • 77. Step n=1 - SOP: s = X0W0 + X1W1 + X2W2 + X3W3 = (+1)(-1) + (248)(-2) + (80)(1) + (68)(6.2) = 4.6.
  • 78. Step n=1 - Output: Y(n) = Y(1) = SGN(s) = SGN(4.6) = +1.
  • 79. Step n=1 - Output: Y(1) = +1, i.e. class BLUE.
  • 80. Step n=1 - Predicted vs. Desired: Y(n) = Y(1) = +1 but d(n) = d(1) = -1. Since Y(n) ≠ d(n), the weights are incorrect; adaptation is required.
  • 81. Weights Adaptation. According to W(n+1) = W(n) + η[d(n) - Y(n)]X(n) with n = 1: W(2) = W(1) + η[d(1) - Y(1)]X(1) = (-1, -2, 1, 6.2) + 0.001(-1 - (+1))(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + 0.001(-2)(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + (-0.002)(+1, 248, 80, 68) = (-1, -2, 1, 6.2) + (-0.002, -0.496, -0.16, -0.136) = (-1.002, -2.496, 0.84, 6.064).
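The adaptation at n=1 can be verified numerically (a sketch; rounding is only for display):

```python
eta = 0.001
w1 = [-1.0, -2.0, 1.0, 6.2]   # W(1)
x1 = [+1, 248, 80, 68]        # X(1)
d, y = -1, +1                 # desired vs. predicted at n=1
# W(2) = W(1) + eta * (d - y) * X(1)
w2 = [wi + eta * (d - y) * xi for wi, xi in zip(w1, x1)]
print([round(v, 3) for v in w2])  # -> [-1.002, -2.496, 0.84, 6.064]
```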
  • 82. Neural Networks Training Example, Step n=2. Parameters of step n=2: η = 0.001; X(n) = X(2) = (+1, 0, 0, 255); W(n) = W(2) = (-1.002, -2.496, 0.84, 6.064); d(n) = d(2) = +1.
  • 83. Step n=2, network diagram: inputs (+1, 0, 0, 255) with the adapted weights (-1.002, -2.496, 0.84, 6.064).
  • 84. Step n=2 - SOP: s = (+1)(-1.002) + (0)(-2.496) + (0)(0.84) + (255)(6.064) = 1545.32.
  • 85. Step n=2 - Output: Y(n) = Y(2) = SGN(s) = SGN(1545.32) = +1.
  • 86. Step n=2 - Output: Y(2) = +1, i.e. class BLUE.
  • 87. Step n=2 - Predicted vs. Desired: Y(2) = +1 and d(2) = +1. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 88. Neural Networks Training Example, Step n=3. Parameters of step n=3: η = 0.001; X(n) = X(3) = (+1, 67, 15, 210); W(n) = W(3) = W(2) = (-1.002, -2.496, 0.84, 6.064); d(n) = d(3) = +1.
  • 89. Step n=3, network diagram: inputs (+1, 67, 15, 210) with weights (-1.002, -2.496, 0.84, 6.064).
  • 90. Step n=3 - SOP: s = (+1)(-1.002) + (67)(-2.496) + (15)(0.84) + (210)(6.064) = 1117.806.
  • 91. Step n=3 - Output: Y(n) = Y(3) = SGN(s) = SGN(1117.806) = +1.
  • 92. Step n=3 - Output: Y(3) = +1, i.e. class BLUE.
  • 93. Step n=3 - Predicted vs. Desired: Y(3) = +1 and d(3) = +1. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 94. Neural Networks Training Example, Step n=4. Parameters of step n=4: η = 0.001; X(n) = X(4) = (+1, 255, 0, 0); W(n) = W(4) = W(3) = (-1.002, -2.496, 0.84, 6.064); d(n) = d(4) = -1.
  • 95. Step n=4, network diagram: inputs (+1, 255, 0, 0) with weights (-1.002, -2.496, 0.84, 6.064).
  • 96. Step n=4 - SOP: s = (+1)(-1.002) + (255)(-2.496) + (0)(0.84) + (0)(6.064) = -637.482.
  • 97. Step n=4 - Output: Y(n) = Y(4) = SGN(s) = SGN(-637.482) = -1.
  • 98. Step n=4 - Output: Y(4) = -1, i.e. class RED.
  • 99. Step n=4 - Predicted vs. Desired: Y(4) = -1 and d(4) = -1. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 100. Neural Networks Training Example, Step n=5. Parameters of step n=5: η = 0.001; X(n) = X(5) = (+1, 248, 80, 68); W(n) = W(5) = W(4) = (-1.002, -2.496, 0.84, 6.064); d(n) = d(5) = -1.
  • 101. Step n=5, network diagram: inputs (+1, 248, 80, 68) with weights (-1.002, -2.496, 0.84, 6.064).
  • 102. Step n=5 - SOP: s = (+1)(-1.002) + (248)(-2.496) + (80)(0.84) + (68)(6.064) = -140.458.
  • 103. Step n=5 - Output: Y(n) = Y(5) = SGN(s) = SGN(-140.458) = -1.
  • 104. Step n=5 - Output: Y(5) = -1, i.e. class RED.
  • 105. Step n=5 - Predicted vs. Desired: Y(5) = -1 and d(5) = -1. Since Y(n) = d(n), the weights are correct; no adaptation.
  • 106. Correct Weights. Having tested the weights across all samples with correct results, we can conclude that the current weights are the correct ones for this neural network. After the training phase comes testing. What is the class of the unknown color with R = 150, G = 100, B = 180?
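That all four training samples are now classified correctly can be confirmed in a few lines (a sketch using the final weights from the example; note the SOP values differ slightly from rounded slide figures):

```python
def sgn(s):
    # +1 if s >= 0, -1 if s < 0
    return 1 if s >= 0 else -1

w = [-1.002, -2.496, 0.84, 6.064]  # final weights W(2)
samples = [((+1, 255, 0, 0), -1), ((+1, 248, 80, 68), -1),
           ((+1, 0, 0, 255), +1), ((+1, 67, 15, 210), +1)]
for x, d in samples:
    y = sgn(sum(xi * wi for xi, wi in zip(x, w)))
    assert y == d  # every prediction matches its desired output
print("all samples correct")
```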
  • 107. Testing the Trained Neural Network, (R, G, B) = (150, 100, 180). Trained network parameters: η = 0.001; W = (-1.002, -2.496, 0.84, 6.064).
  • 108. Testing - SOP: s = (+1)(-1.002) + (150)(-2.496) + (100)(0.84) + (180)(6.064) = 800.118.
  • 109. Testing - Output: Y = SGN(s) = SGN(800.118) = +1.
  • 110. Testing - Output: Y = +1, i.e. the unknown color is classified as BLUE.
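The testing phase is the same forward pass with the trained weights held fixed; no adaptation happens. A minimal sketch for the unknown color:

```python
def sgn(s):
    # +1 if s >= 0, -1 if s < 0
    return 1 if s >= 0 else -1

w = [-1.002, -2.496, 0.84, 6.064]  # trained weights (held fixed)
x = [+1, 150, 100, 180]            # bias, R, G, B of the unknown color
s = sum(xi * wi for xi, wi in zip(x, w))
print(round(s, 3), sgn(s))  # -> 800.118 1, so the color is BLUE
```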