History of Neural Networks
╸ The history of neural networks begins before the invention of the
computer, in 1943.
╸ The first neural network was constructed by neurologists in order to
understand the working of neurons.
╸ Later, technologists also became interested in these networks.
╸ In recent years, the importance of neural networks has become widely
recognized.
What are neural networks?
╸ In information technology (IT), an artificial neural network (ANN) is a system of
hardware and/or software patterned after the operation of neurons in the human
brain. ANNs, also called simply neural networks, are a variety of deep
learning technology, which also falls under the umbrella of artificial intelligence,
or AI.
╸ Commercial applications of these technologies generally focus on solving
complex signal processing or pattern recognition problems. Examples of
significant commercial applications since 2000 include handwriting recognition
for check processing, speech-to-text transcription, oil-exploration data analysis,
weather prediction and facial recognition.
Working of Biological neuron:
╸ A biological neuron consists of four main parts: the dendrites, the cell
body, the axon and the synapse.
Working of Artificial neuron:
╸ An artificial neuron has components analogous to the dendrites, cell body, axon
and synapse of a biological neuron.
╸ In an artificial neural network, a neuron produces an output only when the
weighted sum of its inputs satisfies a threshold value; otherwise it produces no
output (see the sketch below).
╸ A neuron operates in two modes: training mode and using mode.
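As a minimal sketch (not from the slides), the threshold behaviour described above might
look like the following in Python; the weights and threshold values are illustrative
assumptions.

# Minimal sketch of a threshold-style artificial neuron (illustrative values).
def artificial_neuron(inputs, weights, threshold):
    """Produce an output (1) only when the weighted sum reaches the threshold."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Example: two inputs behaving like a logical AND gate.
print(artificial_neuron([1, 1], [0.6, 0.6], threshold=1.0))  # -> 1
print(artificial_neuron([1, 0], [0.6, 0.6], threshold=1.0))  # -> 0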
Supervised learning
Supervised learning, as the name indicates, involves the presence of a supervisor acting as a teacher.
Basically, supervised learning means we teach or train the machine using data that is well labelled,
i.e., some data is already tagged with the correct answer. After that, the machine is provided with a
new set of examples (data) so that the supervised learning algorithm analyses the training data (the
set of training examples) and produces a correct outcome from the labelled data.
For instance, suppose you are given a basket filled with different kinds of fruits. Now the first step is to train
the machine with all the different fruits one by one like this:
If the shape of the object is rounded, has a depression at the top, and is red in color, then it will be
labeled as – Apple.
If the shape of the object is a long curving cylinder with a green-yellow color, then it will be labeled
as – Banana.
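As a minimal sketch, the fruit example above can be expressed as a supervised learning task,
assuming scikit-learn is available; the feature encoding here is an illustrative choice, not
something given in the slides.

# Minimal sketch: training a classifier on labelled fruit examples.
from sklearn.tree import DecisionTreeClassifier

# Illustrative features: [is_round, has_top_depression, is_red, is_long_cylinder, is_green_yellow]
X_train = [
    [1, 1, 1, 0, 0],   # labelled example: Apple
    [0, 0, 0, 1, 1],   # labelled example: Banana
]
y_train = ["Apple", "Banana"]

model = DecisionTreeClassifier().fit(X_train, y_train)

# A new, unseen fruit that is round, red, and has a depression at the top.
print(model.predict([[1, 1, 1, 0, 0]]))   # -> ['Apple']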
- Supervised learning is the type of machine learning in which machines are trained using well "labelled"
training data, and on the basis of that data, machines predict the output. Labelled data means some input
data is already tagged with the correct output.
In supervised learning, the training data provided to the machines works as the supervisor that teaches the
machines to predict the output correctly. It applies the same concept as a student learning under the
supervision of a teacher.
Supervised learning is a process of providing input data as well as correct output data to the machine
learning model. The aim of a supervised learning algorithm is to find a mapping function that maps the
input variable (x) to the output variable (y).
In the real world, supervised learning can be used for risk assessment, image classification, fraud
detection, spam filtering, etc.
- Supervised learning (SL) is a machine learning paradigm for problems where the available data consists
of labelled examples, meaning that each data point contains features (covariates) and an associated label.
The goal of supervised learning algorithms is learning a function that maps feature vectors (inputs) to labels
(output), based on example input-output pairs.[1] It infers a function from labeled training data consisting of a
set of training examples.[2] In supervised learning, each example is a pair consisting of an input object
(typically a vector) and a desired output value (also called the supervisory signal).
Single Layer Perceptron
╸ The perceptron is a single processing unit of a neural network. First proposed by
Frank Rosenblatt in 1958, it is a simple neuron which is used to classify its input
into one of two categories. The perceptron is a linear classifier, and is used in supervised
learning. It helps to organize the given input data.
╸ A perceptron is a neural network unit that does a precise computation to detect
features in the input data. The perceptron is mainly used to classify the data into two
parts. Therefore, it is also known as a linear binary classifier.
╸ The perceptron uses a step function that returns +1 if the weighted sum of its inputs is
greater than or equal to 0, and -1 otherwise (see the sketch below).
╸ The activation function is used to map the input to the required range of values, such as
(0, 1) or (-1, 1).
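As referenced above, here is a minimal sketch of a single-layer perceptron with a step
activation; the training task (logical OR), learning rate and number of epochs are
illustrative assumptions.

# Minimal sketch of a single-layer perceptron (linear binary classifier).
import numpy as np

def step(z):
    """Step activation: +1 if the weighted sum is >= 0, otherwise -1."""
    return 1 if z >= 0 else -1

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, 1, 1, 1])           # labels for logical OR
w, b, lr = np.zeros(2), 0.0, 0.1      # weights, bias, learning rate

for epoch in range(10):
    for xi, target in zip(X, y):
        pred = step(w @ xi + b)
        # Perceptron learning rule: adjust weights only on a misclassification.
        w += lr * (target - pred) * xi
        b += lr * (target - pred)

print([step(w @ xi + b) for xi in X])  # -> [-1, 1, 1, 1]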
What is Backpropagation?
Backpropagation is the essence of neural network training. It is the method of fine-tuning the
weights of a neural network based on the error rate obtained in the previous epoch (i.e.,
iteration). Proper tuning of the weights allows you to reduce error rates and make the model
reliable by increasing its generalization.
Backpropagation in a neural network is short for “backward propagation of errors.” It is a
standard method of training artificial neural networks. This method helps calculate the gradient
of a loss function with respect to all the weights in the network.
Most prominent advantages of Backpropagation are:
Backpropagation is fast, simple and easy to program
It has no parameters to tune apart from the number of inputs
It is a flexible method as it does not require prior knowledge about the network
It is a standard method that generally works well
It does not need any special mention of the features of the function to be learned.
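Building on the description above, the following is a minimal sketch of backpropagation for
a tiny two-layer network learning XOR; the architecture, learning rate and epoch count are
illustrative assumptions, not taken from the slides.

# Minimal sketch: backward propagation of errors in a small 2-4-1 network.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradient of the squared error w.r.t. every weight.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent weight updates (learning rate 0.5 is an illustrative choice).
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out, 2))  # predictions move toward the XOR targets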
How Supervised Learning Works
In supervised learning, models are trained using a labelled dataset, where the model
learns about each type of data. Once the training process is completed, the model is
tested on test data (a separate set of examples held out from training), and then it
predicts the output.
The working of supervised learning can be easily understood by the example below
and the diagram shown on the next slide.
Suppose we have a dataset of different types of shapes, which includes
squares, rectangles, triangles and hexagons. Now the first step is that we need
to train the model for each shape.
If the given shape has four sides, and all the sides are equal, then it will be
labelled as a Square.
If the given shape has three sides, then it will be labelled as a triangle.
If the given shape has six equal sides, then it will be labelled as a hexagon.
Now, after training, we test our model using the test set, and the task of
the model is to identify the shape.
The machine is already trained on all types of shapes, and when it finds a
new shape, it classifies the shape on the basis of the number of sides and
predicts the output, as in the sketch below.
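As mentioned above, a minimal sketch of the shape example might look like this, assuming
scikit-learn; the feature encoding (number of sides, whether all sides are equal) is an
illustrative choice.

# Minimal sketch: train on labelled shapes, then predict on held-out test data.
from sklearn.neighbors import KNeighborsClassifier

# Illustrative features: [number_of_sides, all_sides_equal (1 = yes, 0 = no)]
X_train = [[4, 1], [4, 0], [3, 1], [6, 1]]
y_train = ["Square", "Rectangle", "Triangle", "Hexagon"]

model = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

# Test data kept separate from the training step: a new four-sided, equal-sided shape.
print(model.predict([[4, 1]]))   # -> ['Square']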
Applications of neural networks
- Neural networks have broad applicability to real world business problems.
In fact, they have already been successfully applied in many industries.
- Mobile computing
- Forecasting
- Character recognition
- Traveling salesman problem
- Data mining
- Game development
- Pattern recognition
Pattern Recognition
ANNs are also used in the following specific paradigms:
- Recognition of speakers in communications;
- Hand-written word recognition; and
- Face recognition.
Image recognition by CNN
One of the most popular techniques used in improving the accuracy of image classification is
convolutional neural networks (CNNs for short).
Instead of feeding the entire image as one array of numbers, the image is broken up into a number of
tiles, and the machine then tries to predict what each tile is.
Finally, the computer tries to predict what is in the picture based on the predictions for all the tiles.
This allows the computer to parallelize the operations and detect the object regardless of where it is
located in the image, as in the sketch below.
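As a minimal sketch of this idea (using PyTorch, an assumed framework choice; layer sizes
are illustrative), the small CNN below slides convolutional filters over local tiles of the
image and then classifies from the pooled features.

# Minimal sketch of a tiny CNN image classifier (illustrative sizes).
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        # Convolutions slide small filters over local tiles of the image,
        # so the same feature detector is reused at every position.
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                                  # 14x14 -> 7x7
        )
        # The final prediction combines evidence from all tiles of the image.
        self.classifier = nn.Linear(16 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# One fake single-channel 28x28 image, just to show the output shape.
logits = TinyCNN()(torch.randn(1, 1, 28, 28))
print(logits.shape)  # torch.Size([1, 10])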
Merits:
No need to write algorithms explicitly.
Work by learning from examples.
Work is shared automatically across the network.
Robust.
Neural networks work efficiently.
De-merits:
1. The network needs to be understood before one can work with it.
2. Requires high processing time for large neural networks.
3. Sensitive to noisy data.
4. Takes a long time to train the connections between neurons.
Conclusion
- The computer world has a lot to gain from neural networks.
- Their ability to learn by example makes them very flexible and powerful
- They are also very well suited for real-time systems
- Neural networks also contribute to other areas of research such as neurology and
psychology
- Finally, I would like to state that even though neural networks have huge potential,
we will only get the best out of them when they are integrated with computing, AI,
fuzzy logic and related subjects.