The differences between learning and training in neural networks are discussed, along with an explanation of how a network adapts during the training process.
6. Backpropagation
Training
7. Training a neural network means finding the appropriate weights of the neural connections through a feedback loop called gradient backpropagation.
Backpropagation is the process of computing the gradients of the weight parameters of a neural network using the chain rule, starting with the last layer and moving backwards.
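As a minimal sketch of this chain-rule computation, the following NumPy code runs a forward pass through a tiny two-layer network and then computes the weight gradients backwards, layer by layer. All names (W1, W2, the layer sizes) are illustrative assumptions, not from the source.

```python
import numpy as np

# Tiny 2-layer network: x -> W1 -> sigmoid -> W2 -> y_hat, squared-error loss.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))       # input
y = rng.normal(size=(1, 1))       # target
W1 = rng.normal(size=(4, 3))      # first-layer weights
W2 = rng.normal(size=(1, 4))      # last-layer weights

# Forward pass
z1 = W1 @ x
a1 = sigmoid(z1)
y_hat = W2 @ a1
loss = 0.5 * np.sum((y_hat - y) ** 2)

# Backward pass: apply the chain rule starting with the last layer
dy_hat = y_hat - y                # dL/dy_hat
dW2 = dy_hat @ a1.T               # dL/dW2 (last layer first)
da1 = W2.T @ dy_hat               # propagate back to hidden activations
dz1 = da1 * a1 * (1 - a1)         # through the sigmoid derivative
dW1 = dz1 @ x.T                   # dL/dW1 (moving backwards)
```

In a real framework these gradients are then fed to an optimizer, which is the feedback loop that adjusts the weights.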
9. Structural adaptation: finding the optimal architecture, i.e. how many layers the network needs in order to operate accurately.
Functional adaptation: the slopes of the activation functions keep adapting during learning until they reach their optimal values.
Parameter adaptation: the weights of the network are changed during training.
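Functional adaptation, the least familiar of the three, can be sketched as follows: the slope `a` of an activation f(z) = sigmoid(a·z) is itself treated as a trainable quantity and updated by gradient descent until it fits the data. The setup (the slope name `a`, the learning rate, the toy target generated with slope 3) is an illustrative assumption, not the source's method.

```python
import numpy as np

# Activation with a trainable slope parameter `a`.
def f(z, a):
    return 1.0 / (1.0 + np.exp(-a * z))

z = np.linspace(-2, 2, 50)
target = f(z, 3.0)          # toy data generated with a "true" slope of 3

a = 1.0                     # start from a unit slope
lr = 5.0
for _ in range(500):
    y = f(z, a)
    # dL/da for L = 0.5 * mean((y - target)^2), via the chain rule:
    # dy/da = y * (1 - y) * z
    grad = np.mean((y - target) * y * (1 - y) * z)
    a -= lr * grad          # adapt the slope, not the weights
```

After training, `a` has moved from 1 toward the optimal slope 3, which is exactly the "keeps adapting the slope until it reaches the optimum" behaviour described above.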
Adapting
What is the best learning rate?
Structural adaptation algorithms: SEPA algorithm, cascading algorithm, constructive algorithm
Optimizers: Gradient Descent, Adagrad, Adam
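The update rules of the three optimizers named above can be compared on a toy quadratic loss L(w) = 0.5·w², whose gradient is simply w. The hyperparameters (lr, the Adam betas, eps) are common defaults assumed for illustration; Adagrad and Adam effectively adapt the learning rate per step, which is one answer to the question of choosing it.

```python
import numpy as np

def run(optimizer, steps=100, w0=5.0, lr=0.5):
    w = w0
    cache = 0.0                 # Adagrad: running sum of squared gradients
    m, v = 0.0, 0.0             # Adam: first/second moment estimates
    b1, b2, eps = 0.9, 0.999, 1e-8
    for t in range(1, steps + 1):
        g = w                   # gradient of 0.5 * w^2
        if optimizer == "sgd":
            w -= lr * g                           # plain gradient descent
        elif optimizer == "adagrad":
            cache += g * g
            w -= lr * g / (np.sqrt(cache) + eps)  # per-step shrinking rate
        elif optimizer == "adam":
            m = b1 * m + (1 - b1) * g
            v = b2 * v + (1 - b2) * g * g
            m_hat = m / (1 - b1 ** t)             # bias correction
            v_hat = v / (1 - b2 ** t)
            w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w
```

All three drive w from 5 toward the minimum at 0; plain gradient descent uses a fixed rate, while Adagrad and Adam rescale it from the gradient history.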