3. Feedforward Neural Networks
1. Perceptrons are arranged in layers, with the first layer taking in inputs and the last layer producing
outputs. The middle layers have no connection with the external world, and hence are called hidden
layers.
2. Each perceptron in one layer is connected to every perceptron in the next layer. Information is
therefore constantly "fed forward" from one layer to the next, which is why these networks are called
feedforward networks.
3. There is no connection among perceptrons in the same layer.
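The layered structure above can be sketched as a two-layer forward pass in NumPy. The layer sizes (4 inputs, 5 hidden units, 2 outputs) and the sigmoid activation are illustrative choices, not fixed by the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 4 inputs -> 5 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 2)), np.zeros(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    # Information flows strictly forward: input -> hidden -> output.
    h = sigmoid(x @ W1 + b1)   # hidden layer: no connection to the outside world
    y = sigmoid(h @ W2 + b2)   # output layer
    return y

x = rng.normal(size=4)
print(forward(x).shape)  # (2,)
```

Note that there are no connections within a layer and none skipping backward, matching the three points above.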
4. Challenges And Limitations
While feedforward neural networks are powerful, they come with their own set
of challenges and limitations. One of the main challenges is the choice of the
number of hidden layers and the number of neurons in each layer, which can
significantly affect the performance of the network.
Overfitting is another common issue where the network learns the training
data too well, including the noise, and performs poorly on new, unseen data.
In conclusion, feedforward neural networks are a foundational concept in the
field of neural networks and deep learning. They provide a straightforward
approach to modeling data and making predictions and have paved the way for
more advanced neural network architectures used in modern artificial
intelligence applications.
5. CNN
A Convolutional Neural Network (CNN) is an extended version of the artificial neural network
(ANN), predominantly used to extract features from grid-like matrix datasets, for example
visual datasets such as images or videos, where spatial data patterns play an
extensive role.
A Convolutional Neural Network consists of multiple layers: an input layer, convolutional layers,
pooling layers, and fully connected layers.
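A minimal sketch of the convolution operation at the heart of the convolutional layer (single channel, no padding or stride; the toy image and kernel values are made up for illustration):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' (no padding) 2-D convolution of a single-channel image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output value is a weighted sum over a small window,
            # so the same filter extracts a feature at every position.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(25.0).reshape(5, 5)          # toy 5x5 "image"
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])   # toy 2x2 filter
fmap = conv2d(image, kernel)
print(fmap.shape)  # (4, 4)
```

A pooling layer would then downsample this feature map (e.g. taking the max over each 2x2 block) before the fully connected layers.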
6. Sequence Learning Problems
In feedforward and convolutional neural networks the size of the input was always fixed.
For example, we fed fixed size (32 × 32) images to convolutional neural networks for image
classification.
Further, each input to the network was independent of the previous or future inputs.
7. Contd..
For example, the computations, outputs and decisions for two successive images are completely
independent of each other.
In many applications the input is not of a fixed size.
Further, successive inputs may not be independent of each other. For example, consider the task of
auto-completion: given the first character ‘d’, you want to predict the next character ‘e’, and so on.
8. Contd…
Notice a few things. First, successive inputs are no longer independent (while
predicting ‘e’ you would want to know what the previous input was, in addition
to the current input).
Second, the length of the inputs and the number of predictions you need to
make are not fixed (for example, “learn”, “deep”, and “machine” have different
numbers of characters).
Third, each network (the orange-blue-green structure) performs the same task
(input: character, output: character).
These are known as sequence learning problems. We need to look at a
sequence of (dependent) inputs and produce an output (or outputs). Each input
corresponds to one time step.
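The auto-completion setup above can be written out as per-time-step (input, target) pairs, using the word “deep” from the example:

```python
# Each character is one time step; the target at each step is the next character.
word = "deep"
pairs = [(word[t], word[t + 1]) for t in range(len(word) - 1)]
print(pairs)  # [('d', 'e'), ('e', 'e'), ('e', 'p')]
```

A longer word such as “machine” would simply yield more pairs, which is exactly why a fixed-size network is a poor fit.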
11. Recurrent Neural Networks
Account for dependence between inputs.
Account for variable number of inputs.
Make sure that the function executed at each time step is the same.
We will focus on each of these to arrive at a model for dealing with sequences.
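The three requirements above can be sketched as a minimal NumPy recurrence. The sizes and the weight names W, U, b are illustrative assumptions, not the lecture’s own notation:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_hid = 3, 4

# Shared parameters: the SAME W, U, b are reused at every time step.
W = rng.normal(scale=0.1, size=(n_in, n_hid))
U = rng.normal(scale=0.1, size=(n_hid, n_hid))
b = np.zeros(n_hid)

def rnn(xs):
    h = np.zeros(n_hid)                      # initial hidden state
    states = []
    for x in xs:                             # any number of time steps
        h = np.tanh(x @ W + h @ U + b)       # h_t depends on x_t AND h_{t-1}
        states.append(h)
    return states

short = rnn([rng.normal(size=n_in) for _ in range(2)])
long = rnn([rng.normal(size=n_in) for _ in range(7)])
print(len(short), len(long))  # 2 7
```

The recurrence captures dependence between inputs (via h), handles variable-length sequences (the loop), and applies the same function at every step (shared W, U, b).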
18. Types of RNN
One to One
One to Many
Many to One
Many to Many
One to One RNN: This type of neural network is known as the Vanilla
Neural Network. It is used for general machine learning problems that have
a single input and a single output.
One to Many RNN: This type of neural network has a single input and
multiple outputs. Image captioning is an example of this.
Many to One RNN: This RNN takes a sequence of inputs and generates a single output.
Sentiment analysis is an example of this type of network, where a given
sentence is classified as expressing positive or negative sentiment.
Many to Many RNN: This RNN takes a sequence of inputs and generates a
sequence of outputs. Machine translation is one of the examples.
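As one concrete case, the many-to-one pattern (e.g. a single sentiment score for a whole sentence) can be sketched as follows. The sizes, weight names, and sigmoid readout are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(n_in, n_hid))
U = rng.normal(scale=0.1, size=(n_hid, n_hid))
V = rng.normal(scale=0.1, size=(n_hid, 1))   # readout applied once at the end

def many_to_one(xs):
    """Consume the whole input sequence, emit ONE output."""
    h = np.zeros(n_hid)
    for x in xs:                              # many inputs...
        h = np.tanh(x @ W + h @ U)
    return 1.0 / (1.0 + np.exp(-(h @ V)))     # ...one sigmoid score

score = many_to_one([rng.normal(size=n_in) for _ in range(5)])
print(score.shape)  # (1,)
```

A one-to-many or many-to-many variant would instead apply the readout at every time step rather than only after the last one.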