Contents




         PREFACE                                                 xiii
         ACKNOWLEDGMENTS                                          xv
CHAPTER 1 INTRODUCTION                                             1

          1.1   Why Neural Networks, and Why Now?            1
          1.2   What Is a Neural Net? 3
                1.2.1   Artificial Neural Networks, 3
                1.2.2   Biological Neural Networks, 5
          1.3   Where Are Neural Nets Being Used? 7
                1.3.1   Signal Processing, 7
                1.3.2   Control, 8
                1.3.3   Pattern Recognition, 8
                1.3.4   Medicine, 9
                1.3.5   Speech Production, 9
                1.3.6   Speech Recognition, 10
                1.3.7   Business,        11

          1.4   How Are Neural Networks Used?           11
                1.4.1   Typical Architectures, 12
                1.4.2   Setting the Weights, 15
                1.4.3   Common Activation Functions, 17
                1.4.4   Summary of Notation, 20

Preface




There has been a resurgence of interest in artificial neural networks over the last
few years, as researchers from diverse backgrounds have produced a firm theo-
retical foundation and demonstrated numerous applications of this rich field of
study. However, the interdisciplinary nature of neural networks complicates the
development of a comprehensive, but introductory, treatise on the subject. Neural
networks are useful tools for solving many types of problems. These problems
may be characterized as mapping (including pattern association and pattern clas-
sification), clustering, and constrained optimization. There are several neural net-
works available for each type of problem. In order to use these tools effectively,
it is important to understand the characteristics (strengths and limitations) of each.
        This book presents a wide variety of standard neural networks, with dia-
grams of the architecture, detailed statements of the training algorithm, and sev-
eral examples of the application for each net. In keeping with our intent to show
neural networks in a fair but objective light, typical results of simple experiments
are included (rather than the best possible). The emphasis is on computational
characteristics, rather than psychological interpretations. To illustrate the simi-
larities and differences among the neural networks discussed, similar examples
are used wherever it is appropriate.
        Fundamentals of Neural Networks has been written for students and for
 researchers in academia, industry, and government who are interested in using
 neural networks. It has been developed both as a textbook for a one semester,
 or two quarter, Introduction to Neural Networks course at Florida Institute of
 Technology, and as a resource book for researchers. Our course has been de-
 veloped jointly by neural networks researchers from applied mathematics, com-
puter science, and computer and electrical engineering. Our students are seniors,
or graduate students, in science and engineering; many work in local industry.
      It is assumed that the reader is familiar with calculus and some vector-matrix
notation and operations. The mathematical treatment has been kept at a minimal
level, consistent with the primary aims of clarity and correctness. Derivations,
theorems and proofs are included when they serve to illustrate the important
features of a particular neural network. For example, the mathematical derivation
of the backpropagation training algorithm makes clear the correct order of the
operations. The level of mathematical sophistication increases somewhat in the
later chapters, as is appropriate for the networks presented in chapters 5, 6, and
7. However, derivations and proofs (when included) are presented at the end of
a section or chapter, so that they can be skipped without loss of continuity.
      The order of presentation of the topics was chosen to reflect increasing
complexity of the networks. The material in each chapter is largely independent,
so that the chapters (after the first chapter) may be used in almost any order
desired. The McCulloch-Pitts neuron discussed at the end of Chapter 1 provides
a simple example of an early neural net. Single layer nets for pattern classification
and pattern association, covered in chapters 2 and 3, are two of the earliest ap-
plications of neural networks with adaptive weights. More complex networks,
discussed in later chapters, are also used for these types of problems, as well as
for more general mapping problems. Chapter 6, backpropagation, can logically
follow chapter 2, although the networks in chapters 3-5 are somewhat simpler in
 structure. Chapters 4 and 5 treat networks for clustering problems (and mapping
 networks that are based on these clustering networks). Chapter 7 presents a few
 of the most widely used of the many other neural networks, including two for
 constrained optimization problems.
       Algorithms, rather than computer codes, are provided to encourage the
 reader to develop a thorough understanding of the mechanisms of training and
 applying the neural network, rather than fostering the more superficial familiarity
 that sometimes results from using completely developed software packages. For
 many applications, the formulation of the problem for solution by a neural network
 (and choice of an appropriate network) requires the detailed understanding of the
 networks that comes from performing both hand calculations and developing com-
 puter codes for extremely simple examples.
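        As a minimal sketch of the kind of hand-checkable exercise this paragraph recommends (illustrative code, not from the book; the function name and the particular weights are assumptions), consider the McCulloch-Pitts neuron discussed at the end of Chapter 1:

```python
# Hypothetical illustration: a McCulloch-Pitts style neuron -- binary
# inputs, fixed weights, and a hard threshold -- small enough to check
# the program's output against a hand calculation.

def mcculloch_pitts(inputs, weights, threshold):
    """Fire (output 1) if the weighted input sum reaches the threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# With weights (1, 1) and threshold 2, the unit computes logical AND;
# all four input pairs are easy to verify by hand.
for x1 in (0, 1):
    for x2 in (0, 1):
        print((x1, x2), "->", mcculloch_pitts((x1, x2), (1, 1), 2))
```

Working through the four cases by hand and comparing them with the program's output is exactly the sort of cross-check the algorithms in later chapters are meant to support.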
Acknowledgments




Many people have helped to make this book a reality. I can only mention a few
of them here.
      I have benefited either directly or indirectly from short courses on neural
networks taught by Harold Szu, Robert Hecht-Nielsen, Steven Rogers, Bernard
Widrow, and Tony Martinez.
      My thanks go also to my colleagues for stimulating discussions and en-
couragement, especially Harold K. Brown, Barry Grossman, Fred Ham, De-
metrios Lainiotis, Moti Schneider, Nazif Tepedelenlioglu, and Mike Thursby.
      My students have assisted in the development of this book in many ways;
several of the examples are based on student work. Joe Vandeville, Alan Lindsay,
and Francisco Gomez performed the computations for many of the examples in
Chapter 2. John Karp provided the results for Example 4.8. Judith Lipofsky did
Examples 4.9 and 4.10. Fred Parker obtained the results shown in Examples 4.12
and 4.13. Joseph Oslakovic performed the computations for several of the ex-
amples in Chapter 5. Laurie Walker assisted in the development of the backpro-
pagation program for several of the examples in Chapter 6; Ti-Cheng Shih did the
computations for Example 6.5; Abdallah Said developed the logarithmic activation
function used in Examples 6.7 and 6.8. Todd Kovach, Robin Schumann, and
Hong-wei Du assisted with the Boltzmann machine and Hopfield net examples
in Chapter 7; Ki-suck Yoo provided Example 7.8.
      Several of the network architecture diagrams are adapted from the original
publications as referenced in the text. The spanning tree test data (Figures 4.11,
4.12, 5.11, and 5.12) are used with permission from Springer-Verlag. The illus-
trations of modified Hebbian learning have been adapted from the original pub-
lications: Figure 7.10 has been adapted from Hertz, Krogh, Palmer, Introduction
to the Theory of Neural Computation, © 1991 by Addison-Wesley Publishing
Company, Inc. Figure 7.11 has been adapted and reprinted from Neural Networks,
Vol. 5, Xu, Oja, and Suen, Modified Hebbian Learning for Curve and Surface
Fitting, pp. 441-457, 1992 with permission from Pergamon Press Ltd, Headington
Hill Hall, Oxford OX3 0BW, UK. Several of the figures for the neocognitron are
adapted from (Fukushima, et al., 1983); they are used with permission of IEEE.
The diagrams of the ART2 architecture are used with permission of the Optical
Society of America, and Carpenter and Grossberg. The diagrams of the simple
recurrent net for learning a context sensitive grammar (Servan-Schreiber, et al.,
 1989) are used with the permission of the authors.
       The preparation of the manuscript and software for the examples has been
greatly facilitated by the use of a Macintosh IIci furnished by Apple Computers
under the AppleSeed project. I thank Maurice Kurtz for making it available to
me.
       I appreciate the constructive and encouraging comments of the manuscript
reviewers: Stanley Ahalt, The Ohio State University; Peter Anderson, Rochester
 Institute of Technology; and Nirmal Bose, Penn State University.
       I would like to thank the Prentice-Hall editorial staff, and especially Rick
 DeLorenzo, for their diligent efforts to produce an accurate and attractive product
 within the inevitable time and budget constraints.
       But first, last, and always, I would like to thank my husband and colleague,
 Don Fausett for introducing me to neural networks, and for his patience, en-
 couragement, and advice when asked, during the writing of this book (as well as
 other times).
FUNDAMENTALS OF NEURAL NETWORKS
1.1 WHY NEURAL NETWORKS, AND WHY NOW?

     As modern computers become ever more powerful, scientists continue to be chal-
     lenged to use machines effectively for tasks that are relatively simple for humans.
     Based on examples, together with some feedback from a “teacher,” we learn
     easily to recognize the letter A or distinguish a cat from a bird. More experience
     allows us to refine our responses and improve our performance. Although we may
     eventually be able to describe rules by which we make such decisions, these rules
     do not necessarily reflect the actual process we use. Even without a teacher,
     we can group similar patterns together. Yet another common human activity is
     trying to achieve a goal that involves maximizing a resource (time with one’s
     family, for example) while satisfying certain constraints (such as the need to earn
     a living). Each of these types of problems illustrates tasks for which computer
     solutions may be sought.
            Traditional, sequential, logic-based digital computing excels in many areas,
     but has been less successful for other types of problems. The development of
     artificial neural networks began approximately 50 years ago, motivated by a desire
     to try both to understand the brain and to emulate some of its strengths. Early


 
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...ijaia
 
Quantum neural network
Quantum neural networkQuantum neural network
Quantum neural networksurat murthy
 
Seminar Neuro-computing
Seminar Neuro-computingSeminar Neuro-computing
Seminar Neuro-computingAniket Jadhao
 

Similar to Neural Networks Explained (20)

Artificial Neural Network and its Applications
Artificial Neural Network and its ApplicationsArtificial Neural Network and its Applications
Artificial Neural Network and its Applications
 
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
[IJET V2I2P20] Authors: Dr. Sanjeev S Sannakki, Ms.Anjanabhargavi A Kulkarni
 
Artificial Neural Networks.pdf
Artificial Neural Networks.pdfArtificial Neural Networks.pdf
Artificial Neural Networks.pdf
 
Artificial Neural Network An Important Asset For Future Computing
Artificial Neural Network   An Important Asset For Future ComputingArtificial Neural Network   An Important Asset For Future Computing
Artificial Neural Network An Important Asset For Future Computing
 
Neural network
Neural networkNeural network
Neural network
 
NEURAL NETWORKS
NEURAL NETWORKSNEURAL NETWORKS
NEURAL NETWORKS
 
ANALYSIS ON MACHINE CELL RECOGNITION AND DETACHING FROM NEURAL SYSTEMS
ANALYSIS ON MACHINE CELL RECOGNITION AND DETACHING FROM NEURAL SYSTEMSANALYSIS ON MACHINE CELL RECOGNITION AND DETACHING FROM NEURAL SYSTEMS
ANALYSIS ON MACHINE CELL RECOGNITION AND DETACHING FROM NEURAL SYSTEMS
 
Artificial Neural Network Abstract
Artificial Neural Network AbstractArtificial Neural Network Abstract
Artificial Neural Network Abstract
 
Neural networks report
Neural networks reportNeural networks report
Neural networks report
 
Fuzzy Logic Final Report
Fuzzy Logic Final ReportFuzzy Logic Final Report
Fuzzy Logic Final Report
 
Neural Networks: Introducton
Neural Networks: IntroductonNeural Networks: Introducton
Neural Networks: Introducton
 
Project Report -Vaibhav
Project Report -VaibhavProject Report -Vaibhav
Project Report -Vaibhav
 
Neural Network
Neural NetworkNeural Network
Neural Network
 
Artificial Neural Network: A brief study
Artificial Neural Network: A brief studyArtificial Neural Network: A brief study
Artificial Neural Network: A brief study
 
An Overview On Neural Network And Its Application
An Overview On Neural Network And Its ApplicationAn Overview On Neural Network And Its Application
An Overview On Neural Network And Its Application
 
Presentation on neural network
Presentation on neural networkPresentation on neural network
Presentation on neural network
 
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...
NETWORK LEARNING AND TRAINING OF A CASCADED LINK-BASED FEED FORWARD NEURAL NE...
 
Quantum neural network
Quantum neural networkQuantum neural network
Quantum neural network
 
OSPEN: an open source platform for emulating neuromorphic hardware
OSPEN: an open source platform for emulating neuromorphic hardwareOSPEN: an open source platform for emulating neuromorphic hardware
OSPEN: an open source platform for emulating neuromorphic hardware
 
Seminar Neuro-computing
Seminar Neuro-computingSeminar Neuro-computing
Seminar Neuro-computing
 

More from Zarnigar Altaf

Modeling of reactive system with finite automata
Modeling of reactive system with finite automataModeling of reactive system with finite automata
Modeling of reactive system with finite automataZarnigar Altaf
 
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s)
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s) COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s)
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s) Zarnigar Altaf
 
Black magic presentation
Black magic presentationBlack magic presentation
Black magic presentationZarnigar Altaf
 
Neural network Algos formulas
Neural network Algos formulasNeural network Algos formulas
Neural network Algos formulasZarnigar Altaf
 

More from Zarnigar Altaf (6)

Modeling of reactive system with finite automata
Modeling of reactive system with finite automataModeling of reactive system with finite automata
Modeling of reactive system with finite automata
 
Wireless networks
Wireless networksWireless networks
Wireless networks
 
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s)
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s) COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s)
COMPARISON OF SHORT RANGE WIRELESS NETWORKS (PAN’ s)
 
Black magic presentation
Black magic presentationBlack magic presentation
Black magic presentation
 
Perceptron working
Perceptron workingPerceptron working
Perceptron working
 
Neural network Algos formulas
Neural network Algos formulasNeural network Algos formulas
Neural network Algos formulas
 

Neural Networks Explained

This book presents a wide variety of standard neural networks, with diagrams of the architecture, detailed statements of the training algorithm, and several examples of the application for each net. In keeping with our intent to show neural networks in a fair but objective light, typical results of simple experiments are included (rather than the best possible). The emphasis is on computational characteristics, rather than psychological interpretations. To illustrate the similarities and differences among the neural networks discussed, similar examples are used wherever it is appropriate.

Fundamentals of Neural Networks has been written for students and for researchers in academia, industry, and government who are interested in using neural networks. It has been developed both as a textbook for a one-semester, or two-quarter, Introduction to Neural Networks course at Florida Institute of Technology, and as a resource book for researchers. Our course has been developed jointly by neural networks researchers from applied mathematics, computer science, and computer and electrical engineering. Our students are seniors, or graduate students, in science and engineering; many work in local industry.

It is assumed that the reader is familiar with calculus and some vector-matrix notation and operations. The mathematical treatment has been kept at a minimal level, consistent with the primary aims of clarity and correctness. Derivations, theorems, and proofs are included when they serve to illustrate the important features of a particular neural network. For example, the mathematical derivation of the backpropagation training algorithm makes clear the correct order of the operations. The level of mathematical sophistication increases somewhat in the later chapters, as is appropriate for the networks presented in Chapters 5, 6, and 7. However, derivations and proofs (when included) are presented at the end of a section or chapter, so that they can be skipped without loss of continuity.

The order of presentation of the topics was chosen to reflect increasing complexity of the networks. The material in each chapter is largely independent, so that the chapters (after the first chapter) may be used in almost any order desired. The McCulloch-Pitts neuron discussed at the end of Chapter 1 provides a simple example of an early neural net. Single-layer nets for pattern classification and pattern association, covered in Chapters 2 and 3, are two of the earliest applications of neural networks with adaptive weights. More complex networks, discussed in later chapters, are also used for these types of problems, as well as for more general mapping problems. Chapter 6, backpropagation, can logically follow Chapter 2, although the networks in Chapters 3-5 are somewhat simpler in structure. Chapters 4 and 5 treat networks for clustering problems (and mapping networks that are based on these clustering networks). Chapter 7 presents a few of the most widely used of the many other neural networks, including two for constrained optimization problems.

Algorithms, rather than computer codes, are provided to encourage the reader to develop a thorough understanding of the mechanisms of training and applying the neural network, rather than fostering the more superficial familiarity that sometimes results from using completely developed software packages. For many applications, the formulation of the problem for solution by a neural network (and choice of an appropriate network) requires the detailed understanding of the networks that comes from performing both hand calculations and developing computer codes for extremely simple examples.
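The hand-calculation spirit described above is easy to capture in code. The following is a minimal sketch (not from the book) of the McCulloch-Pitts neuron mentioned in the chapter overview: binary inputs, fixed weights, and a hard threshold activation. The weights and threshold shown here are one conventional choice for realizing a logical AND; the function name and parameters are illustrative, not the book's notation.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """Fire (output 1) iff the weighted sum of binary inputs meets the threshold."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net >= threshold else 0

# AND gate: two excitatory inputs with weight 1 each, threshold 2,
# so the neuron fires only when both inputs are 1.
truth_table = {(x1, x2): mcculloch_pitts((x1, x2), (1, 1), 2)
               for x1 in (0, 1) for x2 in (0, 1)}
print(truth_table)
```

Changing the threshold to 1 turns the same neuron into an OR gate, which is the kind of simple hand experiment the preface recommends.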
Acknowledgments

Many people have helped to make this book a reality. I can only mention a few of them here. I have benefited either directly or indirectly from short courses on neural networks taught by Harold Szu, Robert Hecht-Nielsen, Steven Rogers, Bernard Widrow, and Tony Martinez.

My thanks go also to my colleagues for stimulating discussions and encouragement, especially Harold K. Brown, Barry Grossman, Fred Ham, Demetrios Lainiotis, Moti Schneider, Nazif Tepedelenlioglu, and Mike Thursby.

My students have assisted in the development of this book in many ways; several of the examples are based on student work. Joe Vandeville, Alan Lindsay, and Francisco Gomez performed the computations for many of the examples in Chapter 2. John Karp provided the results for Example 4.8. Judith Lipofsky did Examples 4.9 and 4.10. Fred Parker obtained the results shown in Examples 4.12 and 4.13. Joseph Oslakovic performed the computations for several of the examples in Chapter 5. Laurie Walker assisted in the development of the backpropagation program for several of the examples in Chapter 6; Ti-Cheng Shih did the computations for Example 6.5; Abdallah Said developed the logarithmic activation function used in Examples 6.7 and 6.8. Todd Kovach, Robin Schumann, and Hong-wei Du assisted with the Boltzmann machine and Hopfield net examples in Chapter 7; Ki-suck Yoo provided Example 7.8.

Several of the network architecture diagrams are adapted from the original publications as referenced in the text. The spanning tree test data (Figures 4.11, 4.12, 5.11, and 5.12) are used with permission from Springer-Verlag. The illustrations of modified Hebbian learning have been adapted from the original publications: Figure 7.10 has been adapted from Hertz, Krogh, and Palmer, Introduction to the Theory of Neural Computation, © 1991 by Addison-Wesley Publishing Company, Inc. Figure 7.11 has been adapted and reprinted from Neural Networks, Vol. 5, Xu, Oja, and Suen, "Modified Hebbian Learning for Curve and Surface Fitting," pp. 441-457, 1992, with permission from Pergamon Press Ltd, Headington Hill Hall, Oxford OX3 0BW, UK. Several of the figures for the neocognitron are adapted from (Fukushima, et al., 1983); they are used with permission of IEEE. The diagrams of the ART2 architecture are used with permission of the Optical Society of America, and Carpenter and Grossberg. The diagrams of the simple recurrent net for learning a context-sensitive grammar (Servan-Schreiber, et al., 1989) are used with the permission of the authors.

The preparation of the manuscript and software for the examples has been greatly facilitated by the use of a Macintosh IIci furnished by Apple Computers under the AppleSeed project. I thank Maurice Kurtz for making it available to me.

I appreciate the constructive and encouraging comments of the manuscript reviewers: Stanley Ahalt, The Ohio State University; Peter Anderson, Rochester Institute of Technology; and Nirmal Bose, Penn State University.

I would like to thank the Prentice-Hall editorial staff, and especially Rick DeLorenzo, for their diligent efforts to produce an accurate and attractive product within the inevitable time and budget constraints.

But first, last, and always, I would like to thank my husband and colleague, Don Fausett, for introducing me to neural networks, and for his patience, encouragement, and advice when asked, during the writing of this book (as well as other times).
FUNDAMENTALS OF NEURAL NETWORKS
1.1 WHY NEURAL NETWORKS, AND WHY NOW?

As modern computers become ever more powerful, scientists continue to be challenged to use machines effectively for tasks that are relatively simple for humans. Based on examples, together with some feedback from a "teacher," we learn easily to recognize the letter A or distinguish a cat from a bird. More experience allows us to refine our responses and improve our performance. Although eventually we may be able to describe rules by which we can make such decisions, these do not necessarily reflect the actual process we use. Even without a teacher, we can group similar patterns together. Yet another common human activity is trying to achieve a goal that involves maximizing a resource (time with one's family, for example) while satisfying certain constraints (such as the need to earn a living). Each of these types of problems illustrates tasks for which computer solutions may be sought.

Traditional, sequential, logic-based digital computing excels in many areas, but has been less successful for other types of problems. The development of artificial neural networks began approximately 50 years ago, motivated by a desire to try both to understand the brain and to emulate some of its strengths. Early