1. TYPES OF ANN:
SINGLE AND MULTI-LAYER
Submitted to: Dr. (Mrs.) Lini Mathew, Head of the Department, Electrical Engineering Department, NITTTR, Chandigarh
Submitted by: Divya Shakti, ME(R) IC, 162511
3. INTRODUCTION
An artificial neural network (ANN) is a computational
model based on the structure and functions of biological
neural networks.
Information that flows through the network affects the structure of the ANN, because a neural network changes, or in a sense learns, based on that input and output.
4/13/2017 DIVYA SHAKTI 3
4. INTRODUCTION
An artificial neural network is an interconnected group
of nodes, analogous to the vast network of neurons in a brain.
5. STRUCTURE OF ANN
ANNs have three layers that are interconnected.
The first layer consists of input neurons. Those neurons
send data on to the second layer, which in turn sends data
on to the output neurons in the third layer.
Training an artificial neural network involves choosing
from among the allowed models, for which there are several
associated training algorithms.
6. CLASSIFICATION OF ANNS
With the many ANN models available in the literature, ANNs
can be classified in numerous ways depending on the criteria
chosen, for example:
1. Network architecture
2. Training/learning method
3. Activation function.
7. NEURAL NETWORK
ARCHITECTURE
It is convenient to visualize neurons as arranged in layers.
In addition, it can be understood that neurons in the same
layer behave in the same manner.
Based on the number of layers, NNs can be classified
as follows:
1. Single-layer
2. Multi-layer
8. SINGLE-LAYER NEURAL
NETWORK
A single-layer NN has one layer of connection weights.
The units can be distinguished as input units, which receive
signals from the outside world, and output units, from which
the response of the network can be read.
9. SINGLE-LAYER NEURAL
NETWORK
The shaded nodes on the left are in the so-called input
layer. The input-layer neurons only pass and distribute the
inputs; they perform no computation.
The only true layer of neurons is the one on the right.
10. Each of the inputs is connected to every artificial neuron
in the output layer through a connection weight.
Every output value is calculated from the same set of
input values; the outputs differ only because of their
connection weights.
The presented network is fully connected. A true
biological neural network may not have all possible
connections; a weight value of zero can represent
"no connection".
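As an illustrative sketch (the threshold activation and all numbers below are assumptions, not taken from the slides), a fully connected single-layer network in which zero weights stand in for missing connections can be written as:

```python
def single_layer_output(x, weights):
    """Single-layer NN: one weight matrix connects the inputs directly
    to the output neurons. A weight of 0.0 stands for 'no connection'."""
    step = lambda s: 1 if s >= 0 else 0   # assumed threshold activation
    return [step(sum(w * xi for w, xi in zip(row, x))) for row in weights]

# 3 inputs, 2 output neurons; the 0.0 entries mean those input-to-output
# connections are effectively absent even though the layout is fully connected.
x = [1.0, -2.0, 0.5]
W = [[0.6, 0.0, 0.4],    # neuron 1 ignores input 2
     [0.0, 1.0, -1.0]]   # neuron 2 ignores input 1
print(single_layer_output(x, W))  # [1, 0]
```

Every output neuron sees the same input vector; only the weight rows differ, which is why the two outputs differ.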
11. MULTI-LAYER NEURAL
NETWORK
A multi-layer NN is a network with one or more layers of
nodes (hidden units) between the input units and the output units.
Multi-layer networks can solve more complicated problems
than single-layer nets, but training may be more difficult.
In some cases, however, training may be more successful, because
it is possible to solve a problem that a single-layer net cannot
be trained to perform correctly at all.
12. MULTI-LAYER NEURAL
NETWORK
The multilayer neural network distinguishes itself
from the single-layer network by having one or more hidden
layers.
In this multilayer structure, the input nodes pass the
information to the units in the first hidden layer, then the
outputs from the first hidden layer are passed to the next
layer, and so on.
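The layer-by-layer flow described above can be sketched in plain Python; the sigmoid activation and the example weights are assumptions chosen only for illustration:

```python
import math

def mlp_forward(x, layers):
    """Multi-layer forward pass: the input is passed through each layer
    in turn, and the last layer's output is the network's response.
    Each layer is a list of neuron weight vectors; the sigmoid
    activation is an assumed choice, as the slides do not fix one."""
    sigmoid = lambda s: 1.0 / (1.0 + math.exp(-s))
    signal = x
    for layer in layers:
        signal = [sigmoid(sum(w * s for w, s in zip(row, signal)))
                  for row in layer]
    return signal

# 2 inputs -> 3 hidden -> 2 hidden -> 1 output
net = [
    [[0.5, -0.3], [0.8, 0.2], [-0.4, 0.9]],   # hidden layer 1
    [[0.1, 0.7, -0.2], [0.6, -0.5, 0.3]],     # hidden layer 2
    [[1.0, -1.0]],                            # output layer
]
out = mlp_forward([0.2, 0.8], net)
print(out)  # a single value in (0, 1)
```

Note that the input nodes themselves do no computation; they simply supply the first `signal` vector, matching the description above.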
14. CLASSIFICATION ON THE
BASIS OF DIRECTION OF DATA
FLOW
1. Feed-forward NNs
2. Feed-back NNs
3. Recurrent NNs.
15. EXTREME LEARNING MACHINE WITH A
DETERMINISTIC ASSIGNMENT OF HIDDEN
WEIGHTS IN TWO PARALLEL LAYERS
Abstract: Extreme learning machine (ELM) is a machine
learning technique based on a competitive single-hidden-layer
feed-forward neural network (SLFN). However, traditional
ELM and its variants rely only on a random assignment of the
hidden weights using a uniform distribution, followed by
calculation of the output weights using the least-squares
method.
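A minimal sketch of the traditional ELM procedure described in the abstract, assuming a sigmoid activation and solving the least-squares step via the normal equations (all numbers are illustrative, not from the paper):

```python
import math, random

def elm_train(X, y, n_hidden, seed=0):
    """ELM sketch: hidden weights are drawn at random from a uniform
    distribution and never trained; only the output weights `beta`
    are fitted, by least squares on the hidden-layer outputs H."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W = [[rng.uniform(-1.0, 1.0) for _ in range(n_in)] for _ in range(n_hidden)]
    b = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden)]
    sig = lambda s: 1.0 / (1.0 + math.exp(-s))
    H = [[sig(sum(w * xi for w, xi in zip(W[j], x)) + b[j])
          for j in range(n_hidden)] for x in X]
    # Normal equations (H^T H) beta = H^T y, solved by Gaussian
    # elimination with partial pivoting.
    A = [[sum(H[k][i] * H[k][j] for k in range(len(H)))
          for j in range(n_hidden)] for i in range(n_hidden)]
    c = [sum(H[k][i] * y[k] for k in range(len(H))) for i in range(n_hidden)]
    for i in range(n_hidden):                       # forward elimination
        p = max(range(i, n_hidden), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for r in range(i + 1, n_hidden):
            f = A[r][i] / A[i][i]
            A[r] = [arj - f * aij for arj, aij in zip(A[r], A[i])]
            c[r] -= f * c[i]
    beta = [0.0] * n_hidden
    for i in reversed(range(n_hidden)):             # back substitution
        beta[i] = (c[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, n_hidden))) / A[i][i]
    return W, b, beta

def elm_predict(x, W, b, beta):
    sig = lambda s: 1.0 / (1.0 + math.exp(-s))
    return sum(coef * sig(sum(w * xi for w, xi in zip(Wj, x)) + bias)
               for Wj, bias, coef in zip(W, b, beta))

# Fit y = x0 + x1 on a few points; with as many hidden neurons as
# samples, the network interpolates the training data.
X = [[0, 0], [0, 1], [1, 0], [1, 1], [0.5, 0.5]]
y = [0, 1, 1, 2, 1]
W, b, beta = elm_train(X, y, n_hidden=5)
print(elm_predict([1, 1], W, b, beta))  # close to 2.0
```

The key property is that the random hidden layer is fixed, so training reduces to one linear least-squares solve, which is what makes ELM fast.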
Pablo A. Henríquez, Gonzalo A. Ruz (Elsevier)
http://doi.org/10.1016/j.neucom.2016.11.040
16. RESEARCH PAPER
This paper proposes a new architecture based on one
non-linear layer in parallel with another non-linear
layer, each with independent input weights.
We explore the use of a deterministic assignment of the
hidden weight values.
The simulations are performed with the Halton and Sobol
sequences (deterministic low-discrepancy algorithms).
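The Halton sequence mentioned above can be generated deterministically from the radical-inverse of an index. This minimal sketch (Sobol omitted for brevity) shows the first few 2-D Halton points, which, suitably rescaled, could stand in for uniformly random hidden-weight values:

```python
def halton(index, base):
    """Element `index` (1-based) of the Halton sequence in the given base:
    the radical inverse of `index`, i.e. its base-`base` digits mirrored
    around the radix point. Values fall in (0, 1)."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += (index % base) * f
        index //= base
        f /= base
    return result

# First points of the 2-D Halton sequence (coprime bases 2 and 3);
# scaled to [-1, 1] they could replace random hidden weights.
points = [(halton(i, 2), halton(i, 3)) for i in range(1, 5)]
print(points)  # [(0.5, 0.333...), (0.25, 0.666...), (0.75, 0.111...), (0.125, 0.444...)]
```

Unlike uniform random draws, these points fill the unit square evenly and are reproducible without a seed, which is the motivation for the deterministic assignment the paper explores.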