A REPORT
ON
APPLICATIONS OF FUZZY LOGIC IN NEURAL
NETWORKS
By
Akath Singh Dua 2012B4A7333P
Shikhar Agarwal 2012B4A8613P
Keshav Raheja 2012B4A8678P
Prepared in partial fulfilment of
EA C482/BITS F343 (Fuzzy Logic and Application)
Submitted to
Mrs. Shivi Agarwal
Assistant Professor, Dept. of Mathematics
BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE,
PILANI
(November, 2015)
TABLE OF CONTENTS
1. LIST OF FIGURES……………………………………………………..3
2. ACKNOWLEDGEMENTS…………………………………………….4
3. ABSTRACT……………………………………………………………..5
4. CHAPTER 1……………………………………………………..............6
5. CHAPTER 2……………………………………………………………10
6. REFERENCES…………………………………………………….......16
LIST OF FIGURES
 Figure 1: Neuron vs. Node
 Figure 2: Artificial Neural Network Model
 Figure 3: Neural Network Architecture
 Figure 4: Generalized Fuzzy Neuron
 Figure 5: Sigmoidal Functions
 Figure 6: OR Fuzzy Neuron
 Figure 7: AND Fuzzy Neuron
 Figure 8: OR/AND Fuzzy Neuron
 Figure 9: Multi-layered Fuzzy Neural Networks
ACKNOWLEDGEMENT
We would like to express our gratitude to Prof. Shivi Agarwal, Instructor-in-charge of Fuzzy
Logic and Applications, for providing us with this wonderful opportunity to learn Fuzzy Logic
and Neural Networks and to carry out this project successfully.
Last but not least, we would like to thank our seniors for their constant support and
supervision, and for explaining Fuzzy Neural Networks to us. We are appreciative of
our friends for encouraging us and for their unwavering support.
We cannot end this acknowledgement without extending warm gratitude to Google,
Wikipedia and our textbook, which were indispensable tools in developing this project.
-The Authors
ABSTRACT
This report gives a detailed introduction to Artificial Neural Networks and to the applications of
Fuzzy Logic, in particular fuzzy set theory, to Artificial Neural Networks. The first half of
the report describes the sigmoidal activation function, the function most commonly used to
compute the activation of a neuron, and explains how the network weights are computed
using the Back Propagation algorithm together with gradient descent, the optimization
algorithm used to find the optimal weights, along with the steps involved in the computation.
The second half of the report describes the application of Fuzzy systems in Neural Networks
and gives a detailed description of the different kinds of Fuzzy Neurons, supervised learning
in Fuzzy Neural Networks, and Multi-layered Fuzzy Neural Networks. The last part of the
report discusses the advantages of Fuzzy Logic systems over Neural Networks, showing that
Neuro-Fuzzy systems are computationally more efficient than Neural Networks.
Key Words: Neural Networks, Fuzzy Systems, Fuzzy Neurons, Gradient Descent, Back
Propagation.
CHAPTER-1
Artificial Neural Networks
1.1. OVERVIEW
The past two decades have seen an explosion of renewed interest in the areas of Artificial
Intelligence and Information Processing. Much of this interest has come about with the
successful demonstration of real-world applications of Artificial Neural Networks (ANNs)
and their ability to learn. Initially proposed during the 1950s, the technology suffered a roller
coaster development accompanied by exaggerated claims of their virtues. ANNs have only
recently found a reasonable degree of respectability as a tool suitable for achieving a
nonlinear mapping between an input and output space.
The branch of artificial intelligence called neural networks dates back to the 1940s, when
McCulloch and Pitts developed the first neural model. This was followed in 1962 by
the perceptron model, devised by Rosenblatt, which generated much interest because of its
ability to solve some simple pattern classification problems. This interest started to fade in
1969 when Minsky and Papert provided mathematical proofs of the limitations of the
perceptron and pointed out its weakness in computation. Such drawbacks led to the
temporary decline of the field of neural networks.
The last decade, however, has seen renewed interest in neural networks, both among
researchers and in areas of application. The development of more-powerful networks, better
training algorithms, and improved hardware have all contributed to the revival of the field.
The field has generated interest from researchers in such diverse areas as engineering,
computer science, psychology, neuroscience, physics, and mathematics. Here, we briefly
describe the most important neural models, followed by a discussion of some of their
prominent applications.
Animals are able to react adaptively to changes in their external and internal environment,
and they use their nervous system to perform these behaviours. An appropriate
model/simulation of the nervous system should be able to produce similar responses and
behaviours in artificial systems. Inspired by the structure of the brain, a neural network
consists of a set of highly interconnected entities, called nodes or units. The nervous system
is built from relatively simple units, the neurons, so mimicking their behaviour and
functionality is a natural approach. Each unit is designed to mimic its biological counterpart, the neuron.
Each accepts a weighted set of inputs and responds with an output.
Figure-1: Neuron vs Node
Artificial Neural Networks (ANNs) mimic biological information processing mechanisms.
They are typically designed to perform a nonlinear mapping from a set of inputs to a set of
outputs. ANNs are developed to try to achieve biological system type performance using a
dense interconnection of simple processing elements analogous to biological neurons. ANNs
are information driven rather than data driven. They are adaptive information processing
systems that can automatically develop operational capabilities in response to an information
environment. ANNs learn from experience and generalize from previous examples. They
modify their behaviour in response to the environment.
An artificial network is composed of many artificial neurons that are linked together
according to specific network architecture. The objective of a neural network is to transform
the inputs into meaningful outputs.
Figure-2: Artificial Neural Network Model
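As a concrete illustration of a single unit, the sketch below (the input values, weights and bias are arbitrary assumptions) computes a weighted sum of inputs and passes it through a sigmoid activation:

```python
import math

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# illustrative values only
y = neuron_output(inputs=[0.5, 0.8], weights=[0.4, -0.2], bias=0.1)
```

The output is always in (0, 1), which is what allows the same unit to be stacked into the layered networks discussed below.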
Artificial Neural Networks are not universal panaceas to all problems. They are really just an
alternative mathematical device for rapidly processing information and data. It can be argued
that animal and human intelligence is only a huge extension of this process. Biological
systems learn and then interpolate and extrapolate using slowly propagated (100 m/s)
information when compared to the propagation speed (3×10^8 m/s) of a signal in an electronic
system. Despite this low signal propagation speed the brain is able to perform splendid feats
of computation in everyday tasks. The reason for this enigmatic feat is the degree of
parallelism that exists within the biological brain.
1.2. NEURAL NETWORKS TRAINING
Figure 3: Neural Network Architecture (input layer, hidden layer, output layer)
Types of Training
Once a network has been devised for a particular application, it is ready to be trained. To start
this process, the weights are initialized randomly. Then, the training, or learning, begins.
There are two approaches to training – supervised and unsupervised. Supervised training
involves a mechanism of providing the network with the desired output either by manually
“grading” the network’s performance or by providing the desired outputs with the inputs.
The output is compared with the corresponding target value and the error is determined. The
error is fed back to the network, which updates its weights so as to minimize this error.
Unsupervised training is where the network has to make sense of the inputs without outside
help. Here, the network passes through a self-organizing process. The vast bulk of networks
utilize supervised training. Unsupervised training is used to perform some initial
characterization of inputs. However, in the full-blown sense of being truly self-learning, it is
still just a shining promise that is not fully understood, does not completely work, and is thus
relegated to the lab.
The Back Propagation Algorithm
BP, an acronym for “backward propagation of errors”, is a method in the field of Machine
Learning used for training artificial neural networks in conjunction with an optimization
method such as gradient descent, particularly stochastic gradient descent. The method
calculates the gradient of a cost function with respect to all the weights in the network; the
cost function is generally defined in terms of the sigmoid activation function. The gradient is
fed to the optimization method, which in turn uses it to update the weights in an attempt to
minimize the loss function.
Back propagation requires a known, desired output for each input value in order to calculate
the loss function gradient. It is therefore usually considered to be a supervised learning
method, although it is also used in some unsupervised networks such as autoencoders. It is a
generalization of the delta rule to multi-layered feedforward networks, made possible by
using the chain rule to iteratively compute gradients for each layer. Back propagation requires
that the activation function used by the artificial neurons be differentiable.
The algorithm can be described in five steps as follows:
 Randomly initialize the weights. If the weights were all initialized to zero, we would
have the problem of symmetric weights: after each update, the parameters
corresponding to the inputs going into each of two hidden units would remain identical.
 Implement forward propagation to obtain hθ(x(i)) for any input x(i).
 Implement code to compute the cost function J(θ).
 Implement back propagation to compute the partial derivatives ∂J(θ)/∂θj. This step is
performed for all the training examples, typically with a loop.
 Finally, gradient descent or another advanced optimization technique is used along
with back propagation to minimize J(θ) as a function of the parameters θ.
The cost function for logistic regression is as follows:
J(θ) = −(1/m) Σ(i=1..m) [ y(i) log hθ(x(i)) + (1 − y(i)) log(1 − hθ(x(i))) ]
Here m is the number of training examples, θ are the weights, and hθ is the activation
function corresponding to logistic regression, hθ(x(i)) = 1 / (1 + e^(−θᵀx(i))), where θᵀ is
theta transpose.
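The steps above can be sketched for the logistic-regression case. The snippet below is a minimal illustration, not the report's implementation; the toy AND-style data set, learning rate and step count are assumptions made for the example. It runs batch gradient descent on the cost J(θ):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost(theta, X, y):
    """Logistic-regression cost J(theta)."""
    h = sigmoid(X @ theta)
    m = len(y)
    return -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m

def gradient_descent(X, y, lr=0.5, steps=2000):
    theta = np.zeros(X.shape[1])  # zeros are fine here: no hidden layer, so no symmetry problem
    m = len(y)
    for _ in range(steps):
        h = sigmoid(X @ theta)        # forward propagation
        grad = X.T @ (h - y) / m      # partial derivatives of J with respect to theta
        theta -= lr * grad            # gradient-descent update
    return theta

# toy AND-like data; the first column of X is the constant bias input
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)
theta = gradient_descent(X, y)
```

After training, thresholding sigmoid(Xθ) at 0.5 reproduces the AND targets; in a multi-layer network the same gradient would be pushed backwards layer by layer via the chain rule.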
CHAPTER 2
APPLICATION OF FUZZY LOGIC IN NEURAL NETWORKS
2.1. Introduction to Neuro Fuzzy Systems
From the moment that fuzzy systems became popular in industrial applications, the
community perceived that the development of a fuzzy system with good performance is not
an easy task. The problem of finding membership functions and appropriate rules is
frequently a tiring process of trial and error. This led to the idea of applying learning
algorithms to fuzzy systems. Neural networks, which have efficient learning algorithms, were
presented as an alternative to automate or support the development and tuning of fuzzy
systems.
The first studies of neuro-fuzzy systems date from the beginning of the 1990s, with Jang, Lin
and Lee in 1991, Berenji in 1992 and Nauck from 1993 onwards. The majority of the first
applications were in process control. Gradually, the applications spread to all areas of
knowledge, such as data analysis, data classification, fault detection and decision support.
Neural networks and fuzzy systems can be combined to join their advantages and to cure
their individual weaknesses. Neural networks bring their computational learning capabilities
into fuzzy systems, and receive from them the interpretability and clarity of system
representation. Thus, the disadvantages of fuzzy systems are compensated by the capacities
of neural networks. These techniques are complementary, which justifies their combined use.
A neuro-fuzzy system based on an underlying fuzzy system is trained by means of a data-
driven learning method derived from neural network theory. This heuristic only takes into
account local information to cause local changes in the fundamental fuzzy system. It can be
represented as a set of fuzzy rules at any time of the learning process, i.e., before, during and
after. The learning procedure is constrained to ensure the semantic properties of the
underlying fuzzy system. A neuro-fuzzy system approximates an n-dimensional unknown
function which is partly represented by training examples.
A neuro-fuzzy system is represented as a special three-layer feedforward neural network in
which the first layer corresponds to the input variables, the second layer symbolizes the fuzzy
rules, and the third layer represents the output variables. This report discusses Fuzzy
Neurons, an application of Neuro-Fuzzy Systems.
2.2. THEORY AND APPLICATIONS OF FUZZY NEURONS
Fuzzy Neurons are an application of Neuro-Fuzzy Systems in which the neuron's activation
function is replaced by an operation used in fuzzy logic. The combination of the weighted
input values can be replaced by a combination based on operations such as t-norms or
t-conorms. This modification leads to the structure of a fuzzy neuron based on fuzzy
operators.
Using fuzzy logical neurons, the output is influenced to a greater or lesser extent by the
values of the inputs. This influence depends on both the weights and the fusion operation:
 For a neuron of type AND, the inputs with small weights have the greatest influence.
 For a neuron of type OR, the inputs with large weights are the ones chiefly taken into
account.
This defines the interval of possible values for the output.
 A fuzzy model of an artificial neuron can be constructed by using fuzzy operations at
the single-neuron level.
Figure 4: Generalized Fuzzy Neuron
Instead of a weighted sum of inputs, a more general aggregation function is used. Fuzzy
union, fuzzy intersection and, more generally, s-norms and t-norms can be used as the
aggregation function for the weighted inputs to an artificial neuron.
Figure 5: Sigmoidal Functions
 Transfer function g is linear
 If wk = 0 then wk AND xk = 0 independent of xk, while if wk = 1 then wk AND xk = xk.
Figure 6: OR Fuzzy Neuron
In the generalized forms based on t-norms, operators other than min and max can be used,
such as algebraic and bounded products and sums.
Figure 7: AND Fuzzy Neuron
FUZZY NEURONS
 Both the OR and the AND logic neurons are excitatory in character.
 The issue of inhibitory (negative) weights deserves a short digression.
 In the realm of fuzzy sets, operations are defined in [0, 1].
 The proper solution to make a weighted input inhibitory is to take the fuzzy
complement of the excitatory membership value: negation(x) = 1 − x.
 The input is given by x = (x1, ..., xn).
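With max as the t-conorm and min as the t-norm, the OR and AND fuzzy neurons described above can be sketched as follows (the numeric inputs and weights are illustrative assumptions):

```python
def or_neuron(x, w):
    """OR fuzzy neuron: t-conorm (max) over the t-norms (min) of each weight and input."""
    return max(min(wi, xi) for wi, xi in zip(w, x))

def and_neuron(x, w):
    """AND fuzzy neuron: t-norm (min) over the t-conorms (max) of each weight and input."""
    return min(max(wi, xi) for wi, xi in zip(w, x))

x = (0.3, 0.9)
y = or_neuron(x, (1.0, 0.2))  # in the OR neuron, the input with the larger weight dominates
# an inhibitory second input is obtained by feeding in its fuzzy complement 1 - x2
y_inhibited = or_neuron((x[0], 1 - x[1]), (1.0, 0.2))
```

Note how a zero weight silences an input entirely (min(0, xk) = 0) while a weight of 1 passes it through unchanged (min(1, xk) = xk), matching the property stated for the OR neuron above.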
SUPERVISED LEARNING IN FUZZY NEURAL NETWORKS
 The weighted inputs xi ∘ wi, where ∘ is a t-norm or a t-conorm, can be general
fuzzy relations too, not just the simple products used in standard neurons.
 The transfer function g can be non-linear, such as a sigmoid.
 Supervised learning in an FNN consists of modifying the connection weights in
such a manner that an error measure is progressively reduced.
 Performance should remain acceptable when the network is presented with new data.
 A set of training data pairs (xk, dk), k = 1, 2, ..., n, is used.
 wt+1 = wt + Δwt, where the weight change Δwt = F(|dt − yt|) is a given function of
the difference between the target response d and the calculated node output y.
 The mean square error E is a measure of how well the fuzzy network maps input
data into the corresponding output: E(w) = ½ (dk − yk)².
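One simple way to realize the update rule wt+1 = wt + Δwt is to estimate the gradient of the error E numerically and step against it. The sketch below is only an illustration of the idea; the choice of an OR neuron, the training pairs, the learning rate and the numerical-gradient scheme are all assumptions made for the example:

```python
def or_neuron(x, w):
    """OR fuzzy neuron: max over min(weight, input)."""
    return max(min(wi, xi) for wi, xi in zip(w, x))

def error(samples, w):
    """E(w) = sum of 1/2 (d - y)^2 over the training pairs."""
    return sum(0.5 * (d - or_neuron(x, w)) ** 2 for x, d in samples)

def train(samples, w, lr=0.1, epochs=200, eps=1e-4):
    """Reduce E(w) with a central-difference numerical gradient on each weight."""
    w = list(w)
    for _ in range(epochs):
        for i in range(len(w)):
            w_hi, w_lo = list(w), list(w)
            w_hi[i] += eps
            w_lo[i] -= eps
            grad = (error(samples, w_hi) - error(samples, w_lo)) / (2 * eps)
            w[i] = min(1.0, max(0.0, w[i] - lr * grad))  # keep weights in [0, 1]
    return w

# targets generated by a "true" OR neuron with weights (0.9, 0.1)
samples = [((0.2, 0.7), 0.2), ((0.8, 0.3), 0.8), ((0.5, 0.5), 0.5), ((1.0, 0.0), 0.9)]
w = train(samples, [0.5, 0.5])
```

Because min and max are only piecewise differentiable, a numerical (sub)gradient is a convenient stand-in here; the error drops steadily as the weights approach the generating values.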
2.3. OR/AND FUZZY NEURON
Figure 8: OR/AND Fuzzy Neuron
 This structure can produce a spectrum of intermediate behaviours that can be
modified to suit a given problem.
 If c1 = 0 and c2 = 1, the system reduces to a pure AND neuron.
 If c1 = 1 and c2 = 0, the behaviour corresponds to that of a pure OR neuron.
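A minimal sketch of this combined neuron (with max/min as the t-conorm/t-norm pair; the inputs and weights are illustrative assumptions) aggregates the outputs of an OR and an AND neuron through an OR neuron weighted by c1 and c2:

```python
def or_and_neuron(x, w_or, w_and, c1, c2):
    """OR and AND neurons in parallel, aggregated by an OR neuron with weights c1, c2."""
    y_or = max(min(wi, xi) for wi, xi in zip(w_or, x))    # OR branch
    y_and = min(max(wi, xi) for wi, xi in zip(w_and, x))  # AND branch
    return max(min(c1, y_or), min(c2, y_and))             # OR aggregation of the branches

x = (0.6, 0.3)
pure_and = or_and_neuron(x, (1.0, 1.0), (0.0, 0.0), c1=0.0, c2=1.0)
pure_or = or_and_neuron(x, (1.0, 1.0), (0.0, 0.0), c1=1.0, c2=0.0)
```

Intermediate values of c1 and c2 produce the spectrum of behaviours between the two extremes noted above.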
2.4. MULTILAYERED FUZZY NEURAL NETWORKS
If we restrict ourselves to the pure two-valued Boolean case, the network represents an
arbitrary Boolean function as a sum of minterms. More generally, if the values are
continuous members of a fuzzy set, these networks approximate a certain unknown
fuzzy function.
Figure 9: Multi-layered Fuzzy Neural Networks
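As a small Boolean illustration of the sum-of-minterms view (XOR is my choice of example, not one from the report), an AND layer of min operations can form the minterms and an OR layer of max can combine them; with inputs in [0, 1] the same two-layer network produces a fuzzy XOR:

```python
def fuzzy_xor(x1, x2):
    """Two-layer min/max network: OR of the minterms x1·(NOT x2) and (NOT x1)·x2."""
    m1 = min(x1, 1 - x2)  # AND-layer minterm: x1 AND NOT x2
    m2 = min(1 - x1, x2)  # AND-layer minterm: NOT x1 AND x2
    return max(m1, m2)    # OR layer sums (max) the minterms
```

On crisp 0/1 inputs this reproduces the Boolean XOR truth table; on intermediate membership values it gives a graded degree of "exactly one input is true".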
2.5. Advantages of Fuzzy Logic Systems over Neural Networks
Fuzzy logic systems can be compared to artificial neural networks with regard to structure,
and the use of the same adaptive algorithms is also possible. Both structures are universal
approximators for continuous, nonlinear functions. However, we can observe the following
important differences:
 The fuzzy logic system enables the inclusion of linguistic knowledge in a systematic
way. For adaptive fuzzy logic systems this means that the system's initial
parameters are set extremely well. If we then use a gradient adaptive method, such as
the generalized back-propagation rule, the parameters converge to the true values. In
artificial neural networks, the non-transparent network design prevents the inclusion
of linguistic knowledge, which is why a random selection of initial parameters is
needed, prolonging the learning phase.
 All parameters of the fuzzy logic system have a physical meaning. There is no such
clear connection between inputs, individual parameters, and outputs in artificial neural
networks. Using definitions from classical system identification, we can classify
artificial neural networks as black-box approaches, and fuzzy logic systems as
grey-box approaches.
 Only in a few cases do we not have at least basic linguistic knowledge about the
system or the process available. In such cases, it is possible to construct a fuzzy logic
system with an adaptive algorithm which functions in the same way as an artificial
neural network. Knowledge gained later can be included in the form of an initial
setting of the parameters of the fuzzy logic system or in the form of a change to the
rule base. A fuzzy logic system with parameter adaptation can thus always replace an
artificial neural network, while the reverse is not possible. In addition, the knowledge
gained in the learning phase of adaptive fuzzy logic systems is interpretable.
 When using an artificial neural network and an adaptive fuzzy logic system to solve
the same task, we notice that the fuzzy logic system with adaptive parameters is
significantly less extensive than an equally efficient artificial neural network. Thus, we
need less processor time for the same effect, which is extremely important in real-time
applications.
REFERENCES
1. Wikipedia.com
2. Introduction to Fuzzy Sets and Fuzzy Logic by M. Ganesh
3. neuralnetworksanddeeplearning.com/chap1.html
4. www.extremetech.com
http://portal.survey.ntua.gr/main/labs/rsens/DeCETI/IRIT/KNOWLEDGE-BASED/node78.html

More Related Content

What's hot

Neural Networks
Neural Networks Neural Networks
Neural Networks
Eric Su
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Web
guestecf0af
 
VLSI IN NEURAL NETWORKS
VLSI IN NEURAL NETWORKSVLSI IN NEURAL NETWORKS
VLSI IN NEURAL NETWORKS
Mohan Moki
 

What's hot (20)

Neural networks.ppt
Neural networks.pptNeural networks.ppt
Neural networks.ppt
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
 
Neural networks
Neural networksNeural networks
Neural networks
 
Compegence: Dr. Rajaram Kudli - An Introduction to Artificial Neural Network ...
Compegence: Dr. Rajaram Kudli - An Introduction to Artificial Neural Network ...Compegence: Dr. Rajaram Kudli - An Introduction to Artificial Neural Network ...
Compegence: Dr. Rajaram Kudli - An Introduction to Artificial Neural Network ...
 
Neural Networks
Neural Networks Neural Networks
Neural Networks
 
NEURAL NETWORK FOR THE RELIABILITY ANALYSIS OF A SERIES - PARALLEL SYSTEM SUB...
NEURAL NETWORK FOR THE RELIABILITY ANALYSIS OF A SERIES - PARALLEL SYSTEM SUB...NEURAL NETWORK FOR THE RELIABILITY ANALYSIS OF A SERIES - PARALLEL SYSTEM SUB...
NEURAL NETWORK FOR THE RELIABILITY ANALYSIS OF A SERIES - PARALLEL SYSTEM SUB...
 
neural networks
 neural networks neural networks
neural networks
 
40120140507007
4012014050700740120140507007
40120140507007
 
Soft Computing
Soft ComputingSoft Computing
Soft Computing
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
 
Dissertation character recognition - Report
Dissertation character recognition - ReportDissertation character recognition - Report
Dissertation character recognition - Report
 
Pattern Recognition using Artificial Neural Network
Pattern Recognition using Artificial Neural NetworkPattern Recognition using Artificial Neural Network
Pattern Recognition using Artificial Neural Network
 
Fundamentals of Neural Networks
Fundamentals of Neural NetworksFundamentals of Neural Networks
Fundamentals of Neural Networks
 
Nature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic WebNature Inspired Reasoning Applied in Semantic Web
Nature Inspired Reasoning Applied in Semantic Web
 
Universal Artificial Intelligence for Intelligent Agents: An Approach to Supe...
Universal Artificial Intelligence for Intelligent Agents: An Approach to Supe...Universal Artificial Intelligence for Intelligent Agents: An Approach to Supe...
Universal Artificial Intelligence for Intelligent Agents: An Approach to Supe...
 
Neural Networks for Pattern Recognition
Neural Networks for Pattern RecognitionNeural Networks for Pattern Recognition
Neural Networks for Pattern Recognition
 
VLSI IN NEURAL NETWORKS
VLSI IN NEURAL NETWORKSVLSI IN NEURAL NETWORKS
VLSI IN NEURAL NETWORKS
 
Artificial Neural Network.pptx
Artificial Neural Network.pptxArtificial Neural Network.pptx
Artificial Neural Network.pptx
 
Neural Network Implementation Control Mobile Robot
Neural Network Implementation Control Mobile RobotNeural Network Implementation Control Mobile Robot
Neural Network Implementation Control Mobile Robot
 
A04401001013
A04401001013A04401001013
A04401001013
 

Viewers also liked

Dynamic response of grid connected wind turbine with dfig
Dynamic response of grid connected wind turbine with dfigDynamic response of grid connected wind turbine with dfig
Dynamic response of grid connected wind turbine with dfig
Jose SanLeandro
 
Improved reactive power capability with grid connected doubly fed induction g...
Improved reactive power capability with grid connected doubly fed induction g...Improved reactive power capability with grid connected doubly fed induction g...
Improved reactive power capability with grid connected doubly fed induction g...
Uday Wankar
 
harmonic distortion ppt
harmonic distortion pptharmonic distortion ppt
harmonic distortion ppt
Aditi Tiwari
 
Harmonics in power system
Harmonics in power systemHarmonics in power system
Harmonics in power system
Minh Anh Nguyen
 
Fuzzy Logic Ppt
Fuzzy Logic PptFuzzy Logic Ppt
Fuzzy Logic Ppt
rafi
 

Viewers also liked (16)

Fuzzy front pages
Fuzzy front pagesFuzzy front pages
Fuzzy front pages
 
Fuzzy Logic in the Real World
Fuzzy Logic in the Real WorldFuzzy Logic in the Real World
Fuzzy Logic in the Real World
 
Fuzzy logic
Fuzzy logicFuzzy logic
Fuzzy logic
 
Chapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy LogicChapter 5 - Fuzzy Logic
Chapter 5 - Fuzzy Logic
 
Ece478 12es_final_report
Ece478 12es_final_reportEce478 12es_final_report
Ece478 12es_final_report
 
Dynamic response of grid connected wind turbine with dfig
Dynamic response of grid connected wind turbine with dfigDynamic response of grid connected wind turbine with dfig
Dynamic response of grid connected wind turbine with dfig
 
DFIG_report
DFIG_reportDFIG_report
DFIG_report
 
Improved reactive power capability with grid connected doubly fed induction g...
Improved reactive power capability with grid connected doubly fed induction g...Improved reactive power capability with grid connected doubly fed induction g...
Improved reactive power capability with grid connected doubly fed induction g...
 
Fuzzy logic control of brushless dc motor
Fuzzy logic control of brushless dc motorFuzzy logic control of brushless dc motor
Fuzzy logic control of brushless dc motor
 
harmonic distortion ppt
harmonic distortion pptharmonic distortion ppt
harmonic distortion ppt
 
Doubly fed-induction-generator
Doubly fed-induction-generatorDoubly fed-induction-generator
Doubly fed-induction-generator
 
Fuzzy+logic
Fuzzy+logicFuzzy+logic
Fuzzy+logic
 
Harmonics in power system
Harmonics in power systemHarmonics in power system
Harmonics in power system
 
BLDC control using PID & FUZZY logic controller-CSD PPT
BLDC control using PID & FUZZY logic controller-CSD PPTBLDC control using PID & FUZZY logic controller-CSD PPT
BLDC control using PID & FUZZY logic controller-CSD PPT
 
Fuzzy logic ppt
Fuzzy logic pptFuzzy logic ppt
Fuzzy logic ppt
 
Fuzzy Logic Ppt
Fuzzy Logic PptFuzzy Logic Ppt
Fuzzy Logic Ppt
 

Similar to Fuzzy Logic Final Report

Neural Network
Neural NetworkNeural Network
Neural Network
Sayyed Z
 
Artificial Neural Networks
Artificial Neural NetworksArtificial Neural Networks

Similar to Fuzzy Logic Final Report (20)

Artificial Neural Network: A brief study
Artificial Neural Network: A brief studyArtificial Neural Network: A brief study
Artificial Neural Network: A brief study
 
Artificial Neural Networks: Applications In Management
Artificial Neural Networks: Applications In ManagementArtificial Neural Networks: Applications In Management
Artificial Neural Networks: Applications In Management
 
Neural network
Neural networkNeural network
Neural network
 
Neural network
Neural networkNeural network
Neural network
 
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptxEXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
EXPERT SYSTEMS AND ARTIFICIAL INTELLIGENCE_ Neural Networks.pptx
 
Neural Network
Neural NetworkNeural Network
Neural Network
 
B42010712
B42010712B42010712
B42010712
 
Artificial Neural Networks.pdf
Artificial Neural Networks.pdfArtificial Neural Networks.pdf
Artificial Neural Networks.pdf
 
A Study On Deep Learning
A Study On Deep LearningA Study On Deep Learning
A Study On Deep Learning
 
Deep Learning Survey
Deep Learning SurveyDeep Learning Survey
Deep Learning Survey
 
IRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their ApplicationsIRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their Applications
 
Artificial Intelligence.docx
Artificial Intelligence.docxArtificial Intelligence.docx
Artificial Intelligence.docx
 
F03503030035
F03503030035F03503030035
F03503030035
 
PADDY CROP DISEASE DETECTION USING SVM AND CNN ALGORITHM
PADDY CROP DISEASE DETECTION USING SVM AND CNN ALGORITHMPADDY CROP DISEASE DETECTION USING SVM AND CNN ALGORITHM
PADDY CROP DISEASE DETECTION USING SVM AND CNN ALGORITHM
 
Artificial Neural Networks
Artificial Neural NetworksArtificial Neural Networks
Artificial Neural Networks
 
P2-Artificial.pdf
P2-Artificial.pdfP2-Artificial.pdf
P2-Artificial.pdf
 
Artificial Neural Network report
Artificial Neural Network reportArtificial Neural Network report
Artificial Neural Network report
 
40120140507007
4012014050700740120140507007
40120140507007
 
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdfCCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
CCS355 Neural Networks & Deep Learning Unit 1 PDF notes with Question bank .pdf
 
Extracted pages from Neural Fuzzy Systems.docx
Extracted pages from Neural Fuzzy Systems.docxExtracted pages from Neural Fuzzy Systems.docx
Extracted pages from Neural Fuzzy Systems.docx
 

Fuzzy Logic Final Report

  • 1. 1 A REPORT ON APPLICATIONS OF FUZZY LOGIC IN NEURAL NETWORKS By Akath Singh Dua 2012B4A7333P Shikhar Agarwal 2012B4A8613P Keshav Raheja 2012B4A8678P Prepared in partial fulfilment of EA C482/BITS F343 (Fuzzy Logic and Application) Submitted to Mrs. Shivi Agarwal Assistant Professor, Dept. of Mathematics BIRLA INSTITUTE OF TECHNOLOGY & SCIENCE, PILANI (November, 2015)
  • 2. 2 TABLE OF CONTENTS 1. LIST OF FIGURES……………………………………………………..3 2. ACKNOWLEDGEMENTS…………………………………………….4 3. ABSTRACT……………………………………………………………..5 4. CHAPTER 1……………………………………………………..............6 5. CHAPTER 2……………………………………………………………10 6. REFERENCES…………………………………………………….......16
  • 3. 3 LIST OF FIGURES  Figure 1: Neuron vs. Node  Figure 2: Artificial Neural Network Model  Figure 3: Neural Network Architecture  Figure 4: Generalized Fuzzy Neuron  Figure 5: Sigmoidal Functions  Figure 6: OR Fuzzy Neuron  Figure 7: AND Fuzzy Neuron  Figure 8: OR/AND Fuzzy Neuron  Figure 9: Multi-layered Fuzzy Neural Networks
  • 4. 4 ACKNOWLEDGEMENT We would like to express our gratitude to Prof. Shivi Agarwal, Instructor-in-charge; Fuzzy Logic and Applications, for providing us this wonderful opportunity to learn Fuzzy Logic and Neural Networks and to carry out this project successfully. And last but not the least, we would like to thank our seniors for their constant support and supervision, and also for explaining us about Fuzzy Neural Networks. We are appreciative of our friends for encouraging us and for their unwavering support. We cannot end this Acknowledgement without extending a warm gratitude to Google, Wikipedia and Our Textbook which were an indispensable tool in developing this project. -The Authors
  • 5. 5 ABSTARCT The report gives a detailed introduction to Artificial Neural Networks and the applications of Fuzzy Logic in particular fuzzy set theory to the Artificial Neural Networks. The first half of the report describes the activation function used in the discussion of Neural Networks the sigmoidal function which is the most commonly used function to describe the computation of activation of a Neuron. The weights for the network are computed using the Back Propagation algorithm along with gradient descent which is used as an optimization algorithm to find the optimal weights and the steps involved in computation of the weights. The second half of the report describes the application of Fuzzy systems in Neural Networks and gives a detail description of different kinds Fuzzy Neurons, supervised learning in Fuzzy neurons and Multi-layered Fuzzy Neural Networks. The last part of the report talks about the advantages of Fuzzy Logic systems over the Neural Networks showing the Neural Fuzzy systems being computationally more efficient than the Neural Networks. Key Words -: Neural Networks, Fuzzy Systems, Fuzzy Neurons, Gradient Descent, Back Propagation.
  • 6. 6 CHAPTER-1 Artificial Neural Networks 1.1.OVERVIEW The past two decades have seen an explosion of renewed interest in the areas of Artificial Intelligence and Information Processing. Much of this interest has come about with the successful demonstration of real-world applications of Artificial Neural Networks (ANNs) and their ability to learn. Initially proposed during the 1950s, the technology suffered a roller coaster development accompanied by exaggerated claims of their virtues. ANNs have only recently found a reasonable degree of respectability as a tool suitable for achieving a nonlinear mapping between an input and output space. The branch of artificial intelligence called neural networks dates back to the 1940s, when McCulloch and Pitts developed the first neural model. This was followed in 1962 by the perceptron model, devised by Rosenblatt, which generated much interest because of its ability to solve some simple pattern classification problems. This interest started to fade in 1969 when Minsky and Papert provided mathematical proofs of the limitations of the perceptron and pointed out its weakness in computation. Such drawbacks led to the temporary decline of the field of neural networks. The last decade, however, has seen renewed interest in neural networks, both among researchers and in areas of application. The development of more-powerful networks, better training algorithms, and improved hardware have all contributed to the revival of the field. The field has generated interest from researchers in such diverse areas as engineering, computer science, psychology, neuroscience, physics, and mathematics. Here, we describe a bit about the most important neural models, followed by a discussion of some of its prominent applications. Animals are able to react adaptively to changes in their external and internal environment, and they use their nervous system to perform these behaviours. 
An appropriate model or simulation of the nervous system should be able to produce similar responses and behaviours in artificial systems. Inspired by the structure of the brain, a neural network consists of a set of highly interconnected entities called nodes or units. Since the nervous system is built from relatively simple units, the neurons, copying their behaviour and functionality is a natural starting point. Each unit is designed to mimic its biological counterpart, the neuron: it accepts a weighted set of inputs and responds with an output.
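The unit just described can be sketched in a few lines of Python; the example inputs, weights, and the choice of a sigmoid activation are illustrative assumptions rather than part of any particular network:

```python
import math

def neuron(inputs, weights, bias=0.0):
    """A single artificial node: a weighted sum of the inputs is
    passed through a sigmoid activation function."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # output squashed into (0, 1)

# Two weighted inputs produce one output, mimicking a neuron.
out = neuron([0.5, 0.8], [0.4, -0.2])
```

The output always lies strictly between 0 and 1, which is what makes the sigmoid a convenient stand-in for the firing rate of a biological neuron.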
Figure 1: Neuron vs. Node
Artificial Neural Networks (ANNs) mimic biological information processing mechanisms. They are typically designed to perform a nonlinear mapping from a set of inputs to a set of outputs. ANNs are developed to approach biological-system performance using a dense interconnection of simple processing elements analogous to biological neurons. ANNs are information driven rather than data driven: they are adaptive information processing systems that can automatically develop operational capabilities in response to an information environment. ANNs learn from experience and generalize from previous examples, modifying their behaviour in response to the environment. An artificial neural network is composed of many artificial neurons linked together according to a specific network architecture. The objective of a neural network is to transform the inputs into meaningful outputs.
Figure 2: Artificial Neural Network Model
Artificial Neural Networks are not a universal panacea; they are really just an alternative mathematical device for rapidly processing information and data. It can be argued that animal and human intelligence is only a huge extension of this process. Biological
systems learn, and then interpolate and extrapolate, using slowly propagated (100 m/s) information when compared to the propagation speed (3×10⁸ m/s) of a signal in an electronic system. Despite this low signal propagation speed, the brain is able to perform splendid feats of computation in everyday tasks. The reason for this enigmatic feat is the degree of parallelism within the biological brain.
1.2. NEURAL NETWORK TRAINING
Figure 3: Neural Network Architecture (Input Layer, Hidden Layer, Output Layer)
Types of Training
Once a network has been devised for a particular application, it is ready to be trained. To start this process, the weights are initialized randomly; then the training, or learning, begins. There are two approaches to training: supervised and unsupervised. Supervised training involves a mechanism for providing the network with the desired output, either by manually "grading" the network's performance or by providing the desired outputs together with the inputs. The output is compared with the corresponding target value, the error is determined, and the error is fed back to the network, which updates its weights so as to minimize it. In unsupervised training, the network has to make sense of the inputs without outside help; here, the network passes through a self-organizing process. The vast bulk of networks utilize supervised training. Unsupervised training is used to perform some initial characterization of inputs; however, in the full-blown sense of truly self-learning systems, it remains a promise that is not fully understood, does not completely work, and is thus largely confined to the lab.
The Back Propagation Algorithm
BP, an acronym for "backward propagation of errors", is a method in the field of Machine Learning used for training artificial neural networks. It is used in conjunction with an optimization method such as gradient descent, particularly stochastic gradient descent. The method calculates the gradient of a cost function with respect to all the weights in the network; the cost function is generally defined in terms of the sigmoid function used for activation. The gradient is fed to the optimization method, which in turn uses it to update the weights in an attempt to minimize the loss function. Back propagation requires a known, desired output for each input value in order to calculate the loss function gradient. It is therefore usually considered a supervised learning method, although it is also used in some unsupervised networks such as autoencoders. It is a generalization of the delta rule to multi-layered feedforward networks, made possible by using the chain rule to iteratively compute gradients for each layer. Back propagation requires that the activation function used by the artificial neurons be differentiable. The algorithm can be described in five steps:
 Randomly initialize the weights. If the weights were initialized to zero, we would face the problem of symmetric weights: after each update, the parameters corresponding to the inputs feeding each of two hidden units would remain identical.
 Implement forward propagation to obtain hѲ(x(i)) for any input x(i).
 Compute the cost function J(Ѳ).
 Implement back propagation to compute the partial derivatives of J(Ѳ) with respect to the weights. This step is performed over all the training examples, for instance with a for loop.
 Finally, use gradient descent, or another advanced optimization technique, together with back propagation to minimize J(Ѳ) as a function of the parameters Ѳ.
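The steps above can be sketched for the smallest possible case, a single logistic unit trained by batch gradient descent; the toy dataset, learning rate, and iteration count below are illustrative assumptions:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy training set (assumed for illustration): first feature is a bias term.
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]
y = [0, 0, 1, 1]
m = len(X)

# Step 1: random (non-zero) initialization avoids symmetric weights.
random.seed(0)
theta = [random.uniform(-0.5, 0.5) for _ in range(2)]

for _ in range(2000):
    grad = [0.0, 0.0]
    for xi, yi in zip(X, y):
        # Step 2: forward propagation computes h_theta(x(i)).
        h = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        # Step 4: for the logistic cost J(theta), the partial derivative
        # with respect to theta_j is the average of (h - y) * x_j.
        # (Step 3, evaluating J itself, is omitted here for brevity.)
        for j in range(2):
            grad[j] += (h - yi) * xi[j] / m
    # Step 5: gradient descent update (learning rate 0.5 is assumed).
    theta = [t - 0.5 * g for t, g in zip(theta, grad)]

# After training, the unit separates the two classes.
preds = [sigmoid(sum(t * x for t, x in zip(theta, xi))) > 0.5 for xi in X]
```

With only one layer there is nothing for the chain rule to propagate through; in a multi-layered network, Step 4 would apply the same idea layer by layer.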
The cost function for logistic regression is as follows:
J(Ѳ) = −(1/m) Σ_{i=1..m} [ y(i) log hѲ(x(i)) + (1 − y(i)) log(1 − hѲ(x(i))) ]
Here m is the number of training examples, Ѳ are the weights, and hѲ() is the activation function corresponding to logistic regression, hѲ(x(i)) = 1 / (1 + e^(−Ѳ'x(i))), where Ѳ' is theta transpose.
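As a quick numeric check of the formula, the cost can be evaluated directly; the two-example dataset and the zero initial weights are purely illustrative:

```python
import math

def h(theta, x):
    # Logistic activation: h_theta(x) = 1 / (1 + e^(-theta'x)).
    return 1.0 / (1.0 + math.exp(-sum(t * xi for t, xi in zip(theta, x))))

def J(theta, X, y):
    # Cross-entropy cost averaged over the m training examples.
    m = len(X)
    return -sum(yi * math.log(h(theta, xi)) +
                (1 - yi) * math.log(1 - h(theta, xi))
                for xi, yi in zip(X, y)) / m

# With all weights zero, h is 0.5 for every example, so J equals ln 2.
cost = J([0.0, 0.0], [[1.0, 2.0], [1.0, -1.0]], [1, 0])
```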
CHAPTER 2
APPLICATION OF FUZZY LOGIC IN NEURAL NETWORKS
2.1. Introduction to Neuro-Fuzzy Systems
From the moment fuzzy systems became popular in industrial applications, the community perceived that developing a fuzzy system with good performance is not an easy task. The problem of finding membership functions and appropriate rules is frequently a tiring process of trial and error. This led to the idea of applying learning algorithms to fuzzy systems. Neural networks, which have efficient learning algorithms, were presented as an alternative to automate or support the development and tuning of fuzzy systems. The first studies of neuro-fuzzy systems date from the beginning of the 1990s, with Jang, Lin and Lee in 1991, Berenji in 1992 and Nauck from 1993 onwards. The majority of the first applications were in process control. Gradually, their application spread to all areas of knowledge: data analysis, data classification, defect detection, decision support, and so on.
Neural networks and fuzzy systems can be combined to join their advantages and to compensate for their individual weaknesses. Neural networks bring their computational learning capabilities to fuzzy systems and receive from them the interpretability and clarity of system representation. Thus, the disadvantages of fuzzy systems are compensated by the capacities of neural networks. These techniques are complementary, which justifies using them together.
A neuro-fuzzy system based on an underlying fuzzy system is trained by means of a data-driven learning method derived from neural network theory. This heuristic takes only local information into account to cause local changes in the underlying fuzzy system. It can be represented as a set of fuzzy rules at any time of the learning process, i.e., before, during and after, and the learning procedure is constrained to ensure the semantic properties of the underlying fuzzy system.
A neuro-fuzzy system approximates an n-dimensional unknown function which is partly represented by training examples. It can be represented as a special three-layer feedforward neural network in which the first layer corresponds to the input variables, the second layer symbolizes the fuzzy rules, and the third layer represents the output variables. This report discusses fuzzy neurons, one application of neuro-fuzzy systems.
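This three-layer structure can be illustrated with a minimal single-input system; the membership functions, the two rules, and the weighted-average output stage are all illustrative assumptions rather than part of any specific neuro-fuzzy model:

```python
def mu_low(x):
    # Layer 1 fuzzifies the input variable: membership in "LOW" on [0, 5].
    return max(0.0, min(1.0, (5.0 - x) / 5.0))

def mu_high(x):
    # Membership in "HIGH" on the same range.
    return max(0.0, min(1.0, x / 5.0))

def neuro_fuzzy_forward(x):
    # Layer 2 holds the fuzzy rules; the firing strength of each rule
    # comes from its antecedent membership (one antecedent per rule here).
    r1 = mu_low(x)    # rule 1: IF x is LOW  THEN y is LOW  (output 0)
    r2 = mu_high(x)   # rule 2: IF x is HIGH THEN y is HIGH (output 1)
    # Layer 3 produces the output variable as a firing-strength-weighted
    # average of the rule consequents (a common defuzzification choice).
    return (r1 * 0.0 + r2 * 1.0) / (r1 + r2)

y = neuro_fuzzy_forward(2.0)  # memberships 0.6 and 0.4, so the output is 0.4
```

Learning in such a system would adjust the membership-function parameters, which is exactly the role the neural-network-style training described above plays.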
2.2. THEORY AND APPLICATIONS OF FUZZY NEURONS
A fuzzy neuron is an application of neuro-fuzzy systems in which the neuron's activation function is replaced with an operation from fuzzy logic. The combination of the weighted input values can be replaced by a combination based on operations such as t-norms or t-conorms. This modification leads to the structure of a fuzzy neuron based on fuzzy operators. With fuzzy logical neurons, the output is more or less influenced by the values of the inputs; this influence depends on both the weights and the fusion operation:
 For a neuron of type AND, the inputs with weak weights have the greatest influence.
 For a neuron of type OR, the inputs with significant weights are taken into account most. This defines the interval of possible values for the output.
 A fuzzy model of an artificial neuron can be constructed by using fuzzy operations at the single-neuron level.
Figure 4: Generalized Fuzzy Neuron
Instead of a weighted sum of the inputs, a more general aggregation function is used. Fuzzy union, fuzzy intersection and, more generally, s-norms and t-norms can be used as the aggregation function for the weighted inputs to an artificial neuron.
Figure 5: Sigmoidal Functions
 The transfer function g is linear.
 If wk = 0 then wk AND xk = 0 independently of xk, while if wk = 1 then wk AND xk = xk.
Figure 6: OR Fuzzy Neuron
In the generalized forms based on t-norms, operators other than min and max can be used, such as the algebraic and bounded products and sums.
Figure 7: AND Fuzzy Neuron
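With min as the t-norm and max as the t-conorm, the OR and AND fuzzy neurons of Figures 6 and 7 can be written down directly; the membership values and weights used below are illustrative:

```python
def or_neuron(x, w):
    """OR fuzzy neuron: each input is combined with its weight by a
    t-norm (min), and the results are aggregated by a t-conorm (max)."""
    return max(min(wi, xi) for wi, xi in zip(w, x))

def and_neuron(x, w):
    """AND fuzzy neuron: each input is combined with its weight by a
    t-conorm (max), and the results are aggregated by a t-norm (min)."""
    return min(max(wi, xi) for wi, xi in zip(w, x))

x = [0.3, 0.8]
w = [0.9, 0.4]
y_or = or_neuron(x, w)    # max(min(0.9, 0.3), min(0.4, 0.8)) = 0.4
y_and = and_neuron(x, w)  # min(max(0.9, 0.3), max(0.4, 0.8)) = 0.8

# A zero weight removes an input from an OR neuron entirely,
# matching the wk = 0 case noted above.
assert or_neuron([0.7], [0.0]) == 0.0
```

Swapping min and max for an algebraic product and probabilistic sum would give one of the generalized forms mentioned in the text.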
 Both the OR and the AND logic neurons are excitatory in character.
 The issue of inhibitory (negative) weights deserves a short digression: in the realm of fuzzy sets, operations are defined on [0, 1], so the proper way to make a weighted input inhibitory is to take the fuzzy complement of the excitatory membership value, neg(x) = 1 − x.
 The input is given by x = (x1, ..., xn).
SUPERVISED LEARNING IN FUZZY NEURAL NETWORKS
 The weighted inputs xi o wi, where o is a t-norm or t-conorm, can be general fuzzy relations too, not just the simple products used in standard neurons.
 The transfer function g can be non-linear, such as a sigmoid.
 Supervised learning in a fuzzy neural network consists of modifying its connection weights in such a manner that an error measure is progressively reduced. Its performance should remain acceptable when it is presented with new data.
 Given a set of training data pairs (xk, dk) for k = 1, 2, ..., n, the weights are updated as wt+1 = wt + Δwt, where the weight change is a given function of the difference between the target response d and the calculated node output y: Δwt = F(|dt − yt|).
 The mean square error E is a measure of how well the fuzzy network maps input data into the corresponding output: E(w) = ½ (dk − yk)².
2.3. OR/AND FUZZY NEURON
Figure 8: OR/AND Fuzzy Neuron
 This structure can produce a spectrum of intermediate behaviours that can be modified to suit a given problem.
 If c1 = 0 and c2 = 1, the system reduces to a pure AND neuron.
 If c1 = 1 and c2 = 0, the behaviour corresponds to that of a pure OR neuron.
2.4. MULTI-LAYERED FUZZY NEURAL NETWORKS
If we restrict ourselves to the pure two-valued Boolean case, the network represents an arbitrary Boolean function as a sum of minterms. More generally, if the values are continuous memberships of a fuzzy set, these networks approximate a certain unknown fuzzy function.
Figure 9: Multi-layered Fuzzy Neural Networks
2.5. Advantages of Fuzzy Logic Systems over Neural Networks
Fuzzy logic systems can be compared to artificial neural networks with respect to structure, and the same adaptive algorithms can be used for both. Both structures are also universal approximators of continuous nonlinear functions. However, we can observe the following important differences:
 A fuzzy logic system enables the inclusion of linguistic knowledge in a systematic way. For adaptive fuzzy logic systems this means that the system's initial parameters can be set very well; if we then use a gradient adaptive method, such as the generalized back-propagation rule, the parameters converge towards their true values. In artificial neural networks, the non-transparent network design prevents the inclusion of linguistic knowledge, so a random selection of initial parameters is needed, which prolongs the learning phase.
 All parameters of a fuzzy logic system have a physical meaning. There is no such clear connection between inputs, individual parameters, and outputs in artificial neural networks. Using definitions from classical system identification, artificial neural networks can be classed as black-box approaches, and fuzzy logic systems as gray-box approaches.
 Only in a few cases do we lack even basic linguistic knowledge about the system or process. In such cases, it is possible to construct a fuzzy logic system with an adaptive algorithm that functions in the same way as an artificial neural network; any knowledge gained later can be included in the form of initial parameter settings or changes to the rule base. A fuzzy logic system with parameter adaptation can thus always replace an artificial neural network, while the reverse is not possible. In addition, the knowledge gained in the learning phase of an adaptive fuzzy logic system is interpretable.
 When using an artificial neural network and an adaptive fuzzy logic system to solve the same task, we notice that the fuzzy logic system with adaptive parameters is significantly less extensive than an equally efficient artificial neural network.
Thus, less processor time is needed for the same effect, which is extremely important in real-time applications.
REFERENCES
1. Wikipedia, https://www.wikipedia.org
2. M. Ganesh, Introduction to Fuzzy Sets and Fuzzy Logic.
3. Michael Nielsen, Neural Networks and Deep Learning, http://neuralnetworksanddeeplearning.com/chap1.html
4. www.extremetech.com
5. http://portal.survey.ntua.gr/main/labs/rsens/DeCETI/IRIT/KNOWLEDGE-BASED/node78.html