
19_Learning.ppt

neural network learning


• 1. Learning in Neural Networks
  Neurons and the Brain
  Neural Networks
  Perceptrons
  Multi-layer Networks
  Applications
  The Hopfield Network
• 2. Neural Networks
  A model of reasoning based on the human brain:
  complex networks of simple computing elements
  capable of learning from examples (with appropriate learning methods)
  a collection of simple elements performs high-level operations
• 3. Neural Networks and the Brain (cont.)
  The human brain incorporates nearly 10 billion neurons and 60 trillion connections between them.
  Our brain can be considered a highly complex, non-linear and parallel information-processing system.
  Learning is a fundamental and essential characteristic of biological neural networks.
• 4. Artificial Neuron (Perceptron) Diagram
  Weighted inputs are summed up by the input function.
  The (nonlinear) activation function calculates the activation value, which determines the output.
  [Russell & Norvig, 1995]
• 5. Common Activation Functions
  Step_t(x) = 1 if x >= t, else 0
  Sign(x) = +1 if x >= 0, else -1
  Sigmoid(x) = 1 / (1 + e^(-x))
  [Russell & Norvig, 1995]
• 6. Neural Networks and Logic Gates
  Simple neurons can act as logic gates:
  appropriate choice of activation function, threshold, and weights
  step function as activation function
  [Russell & Norvig, 1995]
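As a concrete illustration of slides 5 and 6 (not part of the original deck), the activation functions and a single threshold unit acting as a logic gate might be sketched as follows; the weights (1, 1) with thresholds 1.5 for AND and 0.5 for OR are illustrative choices, not values given on the slides.

import math

def step(x, t=0.0):
    """Step_t(x): 1 if x >= t, else 0 (slide 5)."""
    return 1 if x >= t else 0

def sign(x):
    """Sign(x): +1 if x >= 0, else -1."""
    return 1 if x >= 0 else -1

def sigmoid(x):
    """Sigmoid(x) = 1 / (1 + e^(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def threshold_neuron(inputs, weights, threshold):
    """Weighted inputs summed by the input function, then passed through a step activation."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return step(total, threshold)

print("Sign(-2) =", sign(-2), " Sigmoid(0) =", sigmoid(0))
# Illustrative (assumed) weights and thresholds: with weights (1, 1),
# a threshold of 1.5 yields AND and a threshold of 0.5 yields OR.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2,
              "AND:", threshold_neuron((x1, x2), (1, 1), 1.5),
              "OR:", threshold_neuron((x1, x2), (1, 1), 0.5))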
• 7. Network Structures
  Layered structures:
  networks are arranged into layers
  interconnections mostly between two layers
  some networks may have feedback connections
• 8. Perceptrons
  Single-layer, feed-forward network.
  Historically one of the first types of neural networks (late 1950s).
  The output is calculated as a step function applied to the weighted sum of inputs.
  Capable of learning simple functions that are linearly separable.
  [Russell & Norvig, 1995]
• 9. Perceptrons and Linear Separability
  Perceptrons can deal with linearly separable functions.
  Some simple functions are not linearly separable, e.g. the XOR function.
  [Figure: the four input points (0,0), (0,1), (1,0), (1,1) plotted for AND and for XOR]
  [Russell & Norvig, 1995]
• 10. Perceptrons and Linear Separability
  Linear separability can be extended to more than two dimensions, but is more difficult to visualize.
  [Russell & Norvig, 1995]
• 11. How does the perceptron learn its classification tasks?
  This is done by making small adjustments in the weights to reduce the difference between the actual and desired outputs of the perceptron.
  The initial weights are randomly assigned, usually in the range [-0.5, 0.5] or [0, 1].
  They are then updated to obtain outputs consistent with the training examples.
• 12. Perceptrons and Learning
  Perceptrons can learn from examples through a simple learning rule. For each example row (iteration), do the following:
  calculate the error of a unit, Err_i, as the difference between the correct output T_i and the calculated output O_i: Err_i = T_i - O_i
  adjust the weight W_j of the input I_j such that the error decreases: W_j = W_j + α * I_j * Err_i
  α is the learning rate, a positive constant less than unity.
  This is a gradient descent search through the weight space.
• 13. Example of perceptron learning: the logical operation AND
  Threshold: θ = 0.2; learning rate: α = 0.1
  (Yd = desired output, Y = actual output, e = error; w1, w2 are the weights before each example and w1', w2' the weights after the update)

  Epoch  x1 x2  Yd    w1    w2   Y    e    w1'   w2'
    1     0  0   0   0.3  -0.1   0    0   0.3  -0.1
          0  1   0   0.3  -0.1   0    0   0.3  -0.1
          1  0   0   0.3  -0.1   1   -1   0.2  -0.1
          1  1   1   0.2  -0.1   0    1   0.3   0.0
    2     0  0   0   0.3   0.0   0    0   0.3   0.0
          0  1   0   0.3   0.0   0    0   0.3   0.0
          1  0   0   0.3   0.0   1   -1   0.2   0.0
          1  1   1   0.2   0.0   1    0   0.2   0.0
    3     0  0   0   0.2   0.0   0    0   0.2   0.0
          0  1   0   0.2   0.0   0    0   0.2   0.0
          1  0   0   0.2   0.0   1   -1   0.1   0.0
          1  1   1   0.1   0.0   0    1   0.2   0.1
    4     0  0   0   0.2   0.1   0    0   0.2   0.1
          0  1   0   0.2   0.1   0    0   0.2   0.1
          1  0   0   0.2   0.1   1   -1   0.1   0.1
          1  1   1   0.1   0.1   1    0   0.1   0.1
    5     0  0   0   0.1   0.1   0    0   0.1   0.1
          0  1   0   0.1   0.1   0    0   0.1   0.1
          1  0   0   0.1   0.1   0    0   0.1   0.1
          1  1   1   0.1   0.1   1    0   0.1   0.1
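The learning rule of slide 12 applied to the AND example above can be sketched in a few lines (this code is not from the deck); the threshold θ = 0.2 and learning rate α = 0.1 are the values stated on the slide, and the initial weights 0.3 and -0.1 are taken from the worked table rather than chosen at random.

def perceptron_output(x, weights, theta):
    """Step activation applied to the weighted sum minus the threshold."""
    return 1 if sum(w * xi for w, xi in zip(weights, x)) - theta >= 0 else 0

def train_perceptron(examples, weights, theta=0.2, alpha=0.1, max_epochs=20):
    """Perceptron learning rule: w_j <- w_j + alpha * x_j * (target - output)."""
    for epoch in range(1, max_epochs + 1):
        converged = True
        for x, target in examples:
            error = target - perceptron_output(x, weights, theta)
            if error != 0:
                converged = False
                # Rounding keeps the small decimal weights exact for this illustration.
                weights = [round(w + alpha * xi * error, 2) for w, xi in zip(weights, x)]
        print(f"epoch {epoch}: weights = {weights}")
        if converged:
            break
    return weights

# Logical AND training set (inputs, desired output), as on slide 13.
and_examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
train_perceptron(and_examples, weights=[0.3, -0.1])

Run with these settings, the weights settle at (0.1, 0.1) after five epochs, matching the last row of the table.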
• 14. Two-dimensional plots of basic logical operations
  [Figure: two-dimensional plots of (a) AND (x1 ∧ x2), (b) OR (x1 ∨ x2), (c) Exclusive-OR (x1 ⊕ x2)]
  A perceptron can learn the operations AND and OR, but not Exclusive-OR.
• 15. Multi-Layer Neural Networks
  The network consists of an input layer of source neurons, at least one middle or hidden layer of computational neurons, and an output layer of computational neurons.
  The input signals are propagated in a forward direction on a layer-by-layer basis (a feedforward neural network).
  The back-propagation learning algorithm can be used for learning in multi-layer networks.
• 16. Multi-Layer Network Diagram
  A two-layer network:
  input units I_k (usually not counted as a separate layer)
  hidden units a_j
  output units O_i
  Usually all nodes of one layer have weighted connections to all nodes of the next layer.
  [Figure: input units I_k connected to hidden units a_j by weights W_kj, and hidden units to output units O_i by weights W_ji]
• 18. Back-Propagation Algorithm
  Learning in a multilayer network proceeds the same way as for a perceptron:
  a training set of input patterns is presented to the network
  the network computes its output pattern, and if there is an error (a difference between the actual and desired output patterns) the weights are adjusted to reduce this error
  the adjustment proceeds from the output layer to the hidden layer(s), updating the weights of the units leading to each layer
• 19. Back-Propagation Algorithm
  In a back-propagation neural network, the learning algorithm has two phases.
  First, a training input pattern is presented to the network input layer. The network propagates the input pattern from layer to layer until the output pattern is generated by the output layer.
  If this pattern is different from the desired output, an error is calculated and then propagated backwards through the network from the output layer to the input layer. The weights are modified as the error is propagated.
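The two-phase procedure just described can be sketched (outside the deck) for a small 2-2-1 sigmoid network trained on the Exclusive-OR patterns used on the following slides; the network size, learning rate, epoch count, and random initialisation are illustrative assumptions rather than values from the slides.

import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Exclusive-OR training patterns: inputs and desired output.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

random.seed(1)
# Weights [w1, w2, bias] for two hidden neurons and one output neuron.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]
alpha, epochs = 0.5, 5000

for _ in range(epochs):
    sse = 0.0
    for (x1, x2), target in data:
        # Phase 1: propagate the input forward, layer by layer.
        h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in w_hidden]
        y = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
        error = target - y
        sse += error ** 2
        # Phase 2: propagate the error backwards and adjust the weights.
        delta_out = error * y * (1 - y)
        delta_hidden = [h[j] * (1 - h[j]) * w_out[j] * delta_out for j in range(2)]
        for j in range(2):
            w_out[j] += alpha * delta_out * h[j]
        w_out[2] += alpha * delta_out
        for j in range(2):
            for k, xk in enumerate((x1, x2, 1)):
                w_hidden[j][k] += alpha * delta_hidden[j] * xk

# As slide 25 notes, training may occasionally settle in a local minimum.
print(f"sum of squared errors after {epochs} epochs: {sse:.4f}")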
• 20. Three-Layer Feed-Forward Neural Network (trained using the back-propagation algorithm)
  [Figure: input layer (x1 ... xn), hidden layer with weights w_ij, and output layer (y1 ... yl) with weights w_jk; input signals flow forward through the layers while error signals flow backward]
• 21. Three-layer network for solving the Exclusive-OR operation
  [Figure: inputs x1 and x2, hidden neurons 3 and 4, output neuron 5, connected by weights w13, w14, w23, w24, w35, w45, with fixed bias inputs]
• 22. Final results of three-layer network learning (Exclusive-OR)

  x1  x2   Desired output yd   Actual output y5   Error e
   1   1          0                 0.0155        -0.0155
   0   1          1                 0.9849         0.0151
   1   0          1                 0.9849         0.0151
   0   0          0                 0.0175        -0.0175
  Sum of squared errors: 0.0010
• 23. Network for solving the Exclusive-OR operation
  [Figure: the same network with fixed weights and thresholds: connection weights of magnitude 1.0, and thresholds of +1.5 for hidden neuron 3, +0.5 for hidden neuron 4, and +0.5 for output neuron 5]
• 24. Decision boundaries
  (a) Decision boundary constructed by hidden neuron 3: x1 + x2 - 1.5 = 0
  (b) Decision boundary constructed by hidden neuron 4: x1 + x2 - 0.5 = 0
  (c) Decision boundaries constructed by the complete three-layer network
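These decision boundaries can be checked directly with hard-threshold units (a small sketch, not code from the deck): hidden neuron 3 fires when x1 + x2 >= 1.5, hidden neuron 4 fires when x1 + x2 >= 0.5, and the output fires when neuron 4 is active but neuron 3 is not. The output-layer weights of +1 and -1 with threshold 0.5 used below are one choice consistent with those boundaries; they are an assumption, not values read off slide 23.

def step(x):
    return 1 if x >= 0 else 0

def xor_network(x1, x2):
    """Hard-threshold network matching the decision boundaries of slide 24."""
    h3 = step(x1 + x2 - 1.5)    # boundary (a): x1 + x2 - 1.5 = 0
    h4 = step(x1 + x2 - 0.5)    # boundary (b): x1 + x2 - 0.5 = 0
    return step(h4 - h3 - 0.5)  # fires only in the region between the two boundaries

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_network(x1, x2))

The four printed rows reproduce the Exclusive-OR truth table.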
• 25. Capabilities of Multi-Layer Neural Networks
  Expressiveness:
  weaker than predicate logic
  good for continuous inputs and outputs
  Computational efficiency:
  training time can be exponential in the number of inputs
  depends critically on parameters like the learning rate
  local minima are problematic (can be overcome by simulated annealing, at additional cost)
  Generalization:
  works reasonably well for some functions (classes of problems)
  no formal characterization of these functions
• 26. Capabilities of Multi-Layer Neural Networks (cont.)
  Sensitivity to noise:
  very tolerant; they perform nonlinear regression
  Transparency:
  neural networks are essentially black boxes
  there is no explanation or trace for a particular answer
  tools for the analysis of networks are very limited
  some limited methods to extract rules from networks
  Prior knowledge:
  very difficult to integrate, since the internal representation of the networks is not easily accessible
• 27. Applications
  Domains and tasks where neural networks are successfully used:
  recognition
  control problems
  series prediction (weather, financial forecasting)
  categorization (sorting of items: fruit, characters, ...)
• 28. The Hopfield Network
  Neural networks were designed by analogy with the brain.
  The brain's memory, however, works by association.
  For example, we can recognise a familiar face even in an unfamiliar environment within 100-200 ms.
  We can also recall a complete sensory experience, including sounds and scenes, when we hear only a few bars of music.
  The brain routinely associates one thing with another.
• 29. Multilayer neural networks trained with the back-propagation algorithm are used for pattern recognition problems.
  However, to emulate the human memory's associative characteristics we need a different type of network: a recurrent neural network.
  A recurrent neural network has feedback loops from its outputs to its inputs.
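A minimal sketch of the associative recall idea described here (not part of the deck), using the standard conventions for a discrete Hopfield network: bipolar +1/-1 states, Hebbian weight storage, and asynchronous sign updates. These specifics, and the example patterns, are assumptions of the sketch rather than details given on the slides.

import numpy as np

def train_hopfield(patterns):
    """Store bipolar (+1/-1) patterns with the Hebbian rule; keep the diagonal at zero."""
    n = len(patterns[0])
    w = np.zeros((n, n))
    for p in patterns:
        p = np.asarray(p, dtype=float)
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, probe, sweeps=5):
    """Asynchronous updates: each neuron takes the sign of its weighted input."""
    state = np.asarray(probe, dtype=float).copy()
    for _ in range(sweeps):
        for i in range(len(state)):
            state[i] = 1.0 if w[i] @ state >= 0 else -1.0
    return state

# Store two 6-bit patterns, then recall the first one from a corrupted probe.
patterns = [[1, 1, 1, -1, -1, -1], [-1, -1, 1, 1, -1, 1]]
w = train_hopfield(patterns)
noisy = [1, -1, 1, -1, -1, -1]   # first pattern with its second bit flipped
print(recall(w, noisy))          # settles back to the stored pattern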
• 30. Single-layer n-neuron Hopfield network
  [Figure: a single layer of n neurons; input signals x1 ... xn feed the neurons, whose output signals y1 ... yn are fed back as inputs]
  The stability of recurrent networks was solved only in 1982, when John Hopfield formulated the physical principle of storing information in a dynamically stable network.
• 31. Chapter Summary
  Learning is very important for agents to improve their decision-making process (unknown environments, changes, time constraints).
  Most methods rely on inductive learning: a function is approximated from sample input-output pairs.
  Neural networks consist of simple interconnected computational elements.
  Multi-layer feed-forward networks can learn any function, provided they have enough units and time to learn.
  • 31. Chapter Summary learning is very important for agents to improve their decision-making process  unknown environments, changes, time constraints most methods rely on inductive learning  a function is approximated from sample input-output pairs  neural networks consist of simple interconnected computational elements multi-layer feed-forward networks can learn any function  provided they have enough units and time to learn