Neural Networks Ver1



  1. NCCT Centre for Advanced Technology SOFTWARE DEVELOPMENT * EMBEDDED SYSTEMS #109, 2nd Floor, Bombay Flats, Nungambakkam High Road, Nungambakkam, Chennai - 600 034. Phone - 044 - 2823 5816, 98412 32310 E-Mail: ncct@eth.net, esskayn@eth.net, URL: ncctchennai.com Dedicated to Commitments, Committed to Technologies
  2. NEURAL NETWORKS AND THEIR APPLICATIONS - NCCT, Where Technology and Solutions Meet
  3. INTRODUCTION - This is a technical presentation on NEURAL NETWORKS AND THEIR APPLICATIONS.
  4. About NCCT
     - NCCT is a leading IT organization backed by strong R&D, concentrating on software development and electronics product development
     - The major activities of NCCT include system software design and development, networking and communication, enterprise computing, application software development, and web technologies development
  5. WHAT WILL WE DISCUSS
     - Machine learning and the human brain
     - Introduction to neural networks
     - Computer neurons
     - Architecture of neural networks
     - Need for neural networks
     - Uses of neural networks
     - Algorithms
     - Applications
  6. MACHINE LEARNING
     - Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example and learn by analogy
     - Learning capabilities can improve the performance of an intelligent system over time
     - The most popular approaches to machine learning are artificial neural networks and genetic algorithms
     - This session is dedicated to NEURAL NETWORKS
  7. LEARNING
     - SUPERVISED LEARNING
       - Examples: recognizing hand-written digits, pattern recognition, regression
       - Labeled examples (input, desired output)
       - Neural network models: perceptron, feed-forward, radial basis function, support vector machine
     - UNSUPERVISED LEARNING
       - Examples: finding similar groups of documents on the web, content-addressable memory, clustering
       - Unlabeled examples (different realizations of the input alone)
       - Neural network models: self-organizing maps, Hopfield networks
  8. BRAIN AND MACHINE
     - THE BRAIN: pattern recognition, association, complexity, noise tolerance
     - THE MACHINE: calculation, precision, logic
  9. The Contrast in Architecture
     - The von Neumann architecture uses a single processing unit:
       - Tens of millions of operations per second
       - Absolute arithmetic precision
     - The brain uses many slow, unreliable processors acting in parallel
  10. Features of the Brain
     - Ten billion neurons
     - Several thousand connections per neuron, on average
     - Hundreds of operations per second
     - Low reliability
     - Neurons die off frequently (and are never replaced)
     - Compensates for these problems with massive parallelism
  11. The Biological Inspiration
     - The brain has been extensively studied by scientists, yet its vast complexity permits only a rudimentary understanding
     - Even the behaviour of an individual neuron is extremely complex
     - Single "percepts" are distributed among many neurons
     - Localized parts of the brain are responsible for certain well-defined functions (e.g. vision, motion)
     - Which features are integral to the brain's performance?
     - Which are incidentals imposed by the fact of biology?
  12. WHAT ARE NEURAL NETWORKS
     - A NEURAL NETWORK can be defined as a model of reasoning based on the human brain
     - The brain consists of a densely interconnected set of nerve cells, or basic information-processing units, called neurons
     - The human brain incorporates nearly 10 billion neurons and 60 trillion connections (synapses) between them
     - By using many neurons simultaneously, the brain can perform its functions much faster than the fastest computers in existence today
  13. WHAT ARE NEURAL NETWORKS
     - A NEURON consists of a cell body (the soma), a number of fibres called dendrites, and a single long fibre called the axon
     - Each neuron has a very simple structure, but an army of such elements constitutes tremendous processing power
     - Neurons are connected by weighted links that pass signals from one neuron to another
     - Neural networks are a type of artificial intelligence that attempts to imitate the way a human brain works
  14. WHAT ARE NEURAL NETWORKS
     - Rather than using a digital model, in which all computations manipulate zeros and ones, a neural network works by creating connections between processing elements, the computer equivalent of neurons
     - The organization and weights of the connections determine the output
     - Information is stored and processed in a neural network simultaneously throughout the whole network, rather than at specific locations
     - In other words, in neural networks both the data and its processing are global rather than local
  15. BIOLOGICAL NEURAL NETWORK (diagram)
  16. The Neuron as a Simple Computing Element (diagram of a neuron)
  17. Analogy between Biological and Artificial Neural Networks
  18. ARCHITECTURE OF A TYPICAL ARTIFICIAL NEURAL NETWORK
  19. USES OF NEURAL NETWORKS
     - Neural networks are used for both regression and classification
     - Regression covers function approximation and time-series prediction
     - In classification, the objective is to assign input patterns to one of several categories or classes, usually represented by outputs restricted to the range 0 to 1
  20. WHY NEURAL NETWORKS?
     - Neural-network function approximation often gives better results than classical regression techniques
     - Neural networks can work very well for nonlinear systems
  21. SIMPLE EXPLANATION - HOW A NEURAL NETWORK WORKS
     - Neural networks use a set of processing elements (or nodes) loosely analogous to neurons in the brain
     - These nodes are interconnected in a network that can identify patterns in data as it is exposed to the data; in a sense, the network learns from experience just as people do
     - This distinguishes neural networks from traditional computing programs, which simply follow instructions in a fixed sequential order
  22. SIMPLE EXPLANATION - HOW A NEURAL NETWORK WORKS
     The structure of a neural network looks something like this image:
     - The bottom layer is the input layer, in this case with 5 inputs labelled X1 through X5
     - In the middle is the hidden layer, with a variable number of nodes; it is the hidden layer that performs much of the work of the network
     - The output layer in this case has two nodes, Z1 and Z2, representing output values we are trying to determine from the inputs
     - For example, we may be trying to predict sales (output) based on past sales, price and season (inputs)
  23. SIMPLE EXPLANATION - MORE ON THE HIDDEN LAYER
     - Each node in the hidden layer is fully connected to the inputs; what is learned in a hidden node is therefore based on all the inputs taken together
     - The hidden layer is where the network learns interdependencies in the model
     - The following diagram provides some detail on what goes on inside a hidden node
  24. SIMPLE EXPLANATION - MORE ON THE HIDDEN LAYER
     - Simply speaking, a weighted sum is performed: X1 times W1, plus X2 times W2, on through X5 and W5
     - This weighted sum is computed for each hidden node and each output node, and is how interactions are represented in the network
     - Each sum is then transformed using a nonlinear function before the value is passed on to the next layer
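The weighted-sum-then-nonlinearity idea above can be sketched in a few lines. This is a minimal illustration, not code from the slides: the 5-input, 2-output shape follows the earlier example, but the hidden-layer size (3), the tanh nonlinearity, and the random weights are all assumptions made here for demonstration.

```python
import math
import random

def forward(x, w_hidden, w_out):
    """One forward pass: a weighted sum at each node, then a nonlinear transform.

    x        : list of 5 inputs (X1..X5)
    w_hidden : one weight list per hidden node (5 weights each)
    w_out    : one weight list per output node (one weight per hidden node)
    """
    # Hidden layer: weighted sum over all inputs, squashed by a nonlinear function
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(w, x))) for w in w_hidden]
    # Output layer: weighted sum over the hidden activations
    return [math.tanh(sum(wi * hi for wi, hi in zip(w, hidden))) for w in w_out]

random.seed(0)
x = [0.5, 0.1, -0.3, 0.8, 0.2]                                   # X1..X5
w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(5)] for _ in range(3)]
w_out = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
z1, z2 = forward(x, w_hidden, w_out)                              # Z1, Z2
```

Because every hidden node sums over all five inputs, each hidden activation mixes the whole input together, which is exactly how the network represents interactions among inputs.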
  25. HEBBIAN LEARNING
     - TWO NEURONS REPRESENT TWO CONCEPTS
       - The synaptic strength between them indicates the strength of association between the concepts
     - HEBBIAN LEARNING
       - Connections are strengthened whenever two concepts occur together
     - PAVLOVIAN CONDITIONING
       - An animal is trained to associate two events, e.g. food is served after a bell rings
  26. CAN A SINGLE NEURON LEARN A TASK?
     - In 1958, Frank Rosenblatt introduced a training algorithm that provided the first procedure for training a simple ANN: the perceptron
     - The perceptron is the simplest form of a neural network; it consists of a single neuron with adjustable synaptic weights and a hard limiter
  27. THE PERCEPTRON
     - The operation of Rosenblatt's perceptron is based on the McCulloch and Pitts neuron model; the model consists of a linear combiner followed by a hard limiter
     - The weighted sum of the inputs is applied to the hard limiter, which produces an output equal to +1 if its input is positive and -1 if it is negative
  28. SINGLE-LAYER TWO-INPUT PERCEPTRON
  29. How does the perceptron learn its classification tasks?
     - This is done by making small adjustments to the weights to reduce the difference between the actual and desired outputs of the perceptron
     - The initial weights are randomly assigned, usually in the range [-0.5, 0.5], and then updated to obtain an output consistent with the training examples
  30. How does the perceptron learn its classification tasks?
     - If at iteration p the actual output is Y(p) and the desired output is Yd(p), then the error is given by:
       e(p) = Yd(p) - Y(p), where p = 1, 2, 3, . . .
     - Iteration p here refers to the p-th training example presented to the perceptron
     - If the error e(p) is positive, we need to increase the perceptron output Y(p); if it is negative, we need to decrease Y(p)
  31. THE PERCEPTRON LEARNING RULE
     wi(p + 1) = wi(p) + α · xi(p) · e(p), where p = 1, 2, 3, . . .
     α is the learning rate, a positive constant less than unity.
     The perceptron learning rule was first proposed by Rosenblatt in 1960. Using this rule we can derive the perceptron training algorithm for classification tasks.
  32. PERCEPTRON'S TRAINING ALGORITHM
     STEP 1: INITIALISATION
     Set the initial weights w1, w2, ..., wn and the threshold θ to random numbers in the range [-0.5, 0.5].
     STEP 2: ACTIVATION
     Activate the perceptron by applying inputs x1(p), x2(p), ..., xn(p) and the desired output Yd(p). Calculate the actual output at iteration p (starting with p = 1):
     Y(p) = step[ x1(p)·w1(p) + x2(p)·w2(p) + ... + xn(p)·wn(p) - θ ]
     where n is the number of perceptron inputs and step is a step activation function.
  33. PERCEPTRON'S TRAINING ALGORITHM
     STEP 3: WEIGHT TRAINING
     Update the weights of the perceptron:
     wi(p + 1) = wi(p) + Δwi(p)
     where Δwi(p) is the weight correction at iteration p, computed by the delta rule:
     Δwi(p) = α · xi(p) · e(p)
     STEP 4: ITERATION
     Increase iteration p by one, go back to Step 2 and repeat the process until convergence.
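Steps 1 to 4 above can be sketched as a short program. This is a minimal illustration under assumptions not stated in the slides: a 0/1 step output, the logical AND function as the training task, α = 0.1, and the threshold updated like a weight attached to a constant −1 input.

```python
import random

def step(x):
    """Step activation: 1 if the net input is non-negative, else 0."""
    return 1 if x >= 0 else 0

def train_perceptron(examples, n_inputs, alpha=0.1, max_epochs=100):
    # Step 1: initialise weights and threshold randomly in [-0.5, 0.5]
    random.seed(1)
    w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
    theta = random.uniform(-0.5, 0.5)
    for _ in range(max_epochs):
        converged = True
        for x, yd in examples:
            # Step 2: activation - compute the actual output Y(p)
            y = step(sum(wi * xi for wi, xi in zip(w, x)) - theta)
            e = yd - y                       # error e(p) = Yd(p) - Y(p)
            if e != 0:
                converged = False
                # Step 3: delta rule, Δwi = α · xi · e
                for i in range(n_inputs):
                    w[i] += alpha * x[i] * e
                # Threshold treated as a weight on a constant -1 input
                theta += alpha * (-1) * e
        # Step 4: repeat over the examples until an error-free epoch
        if converged:
            break
    return w, theta

# Logical AND - a classic linearly separable task the perceptron can learn
and_examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, theta = train_perceptron(and_examples, 2)
```

The perceptron convergence theorem guarantees that this loop terminates for linearly separable data such as AND; for a non-separable task like XOR it would simply exhaust `max_epochs`.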
  34. NEURON COMPUTATION
     - The neuron computes the weighted sum of the input signals and compares the result with a threshold value θ
     - If the net input is less than the threshold, the neuron output is -1; if the net input is greater than or equal to the threshold, the neuron becomes activated and its output attains the value +1
     - The neuron uses the following transfer, or activation, function:
       X = x1·w1 + x2·w2 + ... + xn·wn;  Y = +1 if X ≥ θ, Y = -1 if X < θ
     - This type of activation function is called a sign function
  35. ACTIVATION FUNCTIONS
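The figure from this slide is not reproduced in the transcript. As a stand-in, here are four activation functions commonly contrasted in introductory material, including the step and sign functions used in the surrounding slides; the inclusion of sigmoid and tanh is an assumption about what the original figure showed.

```python
import math

def step(x):
    """Hard limiter with 0/1 output (used in the training algorithm above)."""
    return 1 if x >= 0 else 0

def sign(x):
    """Hard limiter with +1/-1 output (the sign function from slide 34)."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Smooth squashing of any real input into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Smooth squashing of any real input into the open interval (-1, 1)."""
    return math.tanh(x)
```

The hard limiters suit the single perceptron; the smooth functions matter later because multi-layer training methods need differentiable activations.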
  36. EXAMPLE
     A neuron uses a step function as its activation function, with threshold θ = 0.2 and weights w1 = 0.1, w2 = 0.4. What is the output Y for the following values of x1 and x2?
     x1  x2  Y
     0   0   ?
     0   1   ?
     1   0   ?
     1   1   ?
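The example can be checked directly by computing the weighted sum against the threshold for each input pair, using the convention from the earlier slides that the step function outputs 1 when the net input is greater than or equal to zero.

```python
def step(x):
    return 1 if x >= 0 else 0

theta, w1, w2 = 0.2, 0.1, 0.4

results = []
for x1 in (0, 1):
    for x2 in (0, 1):
        y = step(w1 * x1 + w2 * x2 - theta)   # net input minus threshold
        results.append((x1, x2, y))
        print(x1, x2, y)
# Prints: (0,0)->0, (0,1)->1, (1,0)->0, (1,1)->1
```

In words: only when x2 = 1 does the net input (0.4 or 0.5) reach the threshold 0.2, so the neuron's output here equals x2.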
  37. NETWORK STRUCTURE
     - The output signal is transmitted through the neuron's outgoing connection
     - The outgoing connection splits into a number of branches that transmit the same signal
     - The outgoing branches terminate at the incoming connections of other neurons in the network
  38. NETWORK ARCHITECTURES
     THREE DIFFERENT CLASSES OF NETWORK ARCHITECTURE:
     - Single-layer feed-forward
     - Multi-layer feed-forward
     - Recurrent
     The ARCHITECTURE of a neural network is linked with the learning algorithm used to train it. In feed-forward networks, neurons are organized in acyclic layers.
  39. NETWORK ARCHITECTURES - SINGLE-LAYER FEED-FORWARD: an input layer of source nodes and an output layer of neurons
  40. NETWORK ARCHITECTURES - MULTI-LAYER FEED-FORWARD: input layer, hidden layer, output layer (a 3-4-2 network)
  41. RECURRENT NETWORK
     - A recurrent network with hidden neuron(s): the unit-delay operator z^-1 implies a dynamic system (input, hidden and output layers, with z^-1 feedback connections)
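The z^-1 (unit-delay) feedback means the hidden state from the previous time step is fed back as an extra input at the current step. A minimal one-neuron sketch, with illustrative weight values not taken from the slides:

```python
import math

def recurrent_step(x, h_prev, w_in, w_rec):
    """One time step of a single recurrent hidden neuron.

    h_prev is the unit-delayed (z^-1) copy of the hidden state,
    fed back alongside the fresh input x.
    """
    return math.tanh(w_in * x + w_rec * h_prev)

h = 0.0                        # the delayed state starts at zero
outputs = []
for x in [1.0, 0.5, -0.2]:     # a short input sequence
    h = recurrent_step(x, h, w_in=0.8, w_rec=0.3)
    outputs.append(h)
```

Because each step's output depends on the previous state, the same input value can produce different outputs at different times, which is what makes the network a dynamic system rather than a pure function of the current input.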
  42. NEURAL NETWORK ARCHITECTURES
  43. NEURAL NETWORK APPLICATIONS
     - Biomedical applications
     - Business forecasting
     - Demand analysis and forecasting
     - Marketing
     - Financial applications
     - Space research
     - Psychiatric diagnosis
  44. FACE RECOGNITION - 90% accuracy at learning head pose and recognizing 1 of 20 faces
  45. HANDWRITTEN DIGIT RECOGNITION
  46. Projects @ NCCT - Redefining the Learning: specialization, design, development and implementation with projects. Experience learning with the latest new tools and technologies.
  47. Projects @ NCCT - Project Specialization Concept
     - NCCT, in consultation with its Export-Software Division, offers live electronics-related projects, to experience learning with the latest new tools and technologies
     - NCCT believes in specialized hardware design, development training and implementation, with an emphasis on development principles and standards
     - NCCT plays a dual positive role by satisfying your academic requirements as well as giving the necessary training in electronics and embedded product development
  48. Projects @ NCCT - WE OFFER PROJECTS FOR THE FOLLOWING DISCIPLINES
     - Computer Science and Engineering
     - Information Technology
     - Electronics and Communication Engineering
     - Electrical and Electronics Engineering
     - Electronics and Instrumentation
     - Mechanical and Mechatronics
  49. Projects @ NCCT - PROJECTS IN THE AREAS OF
     - System software development
     - Application software development, porting
     - Networking & communication
     - Data mining, neural networks, fuzzy logic, AI
     - Biomedical
     - Web & Internet
     - Embedded systems: microcontrollers, VLSI, DSP, RTOS
     - WAP, web-enabled Internet applications
     - UNIX/LINUX-based projects
  50. Projects @ NCCT - SAMPLE PROJECTS: ANN TECHNOLOGY
     - Character and pattern recognition using neural networks
  51. Projects @ NCCT
     - BRIEF IDEA: To recognize handwritten characters using artificial neural networks
     - FEATURES: Uses ANN technology; accurate; easy to implement; foolproof mechanism
  52. Projects @ NCCT - SAMPLE PROJECTS: NEURAL NETWORK BASED MEDICAL SYSTEMS
     - Neural network based diagnostic system
  53. Projects @ NCCT
     - BRIEF IDEA: Forecasting fetal heart beats using neural networks; combines input windows, hidden layers, feedback and a self-recurrent unit
     - FEATURES: Additional self-recurrent input; combines several techniques for processing temporal aspects of the input sequence
  54. Placements @ NCCT
     NCCT has a large placement wing, which enrolls all candidates in its placement bank and keeps in constant touch with various IT-related industries in India and abroad that need computer-trained, quality manpower.
     Each candidate goes through a complete pre-placement session before placement is made by NCCT.
     The placement division also helps students in getting projects, and organizes guest lectures, group discussions, soft-skills training, mock interviews, personality development, easy-learning skills, technical discussions, student meetings, etc.
     For every student, we send the IT organizations the following documents:
     * A curriculum highlighting the student's skills
     * A brief write-up of the software knowledge acquired at NCCT and the syllabus taught at NCCT
     * Projects and specialization work done at NCCT
     * Additional skills learnt
  55. NCCT - THE FOLLOWING SKILL SET IS SECURED
  56. NCCT - Quality is Our Responsibility. Dedicated to Commitments and Committed to Technology.
