Neural Networks NN 1



    1. Neural Networks NN 1
       Teacher: Elena Marchiori, R4.47, elena@cs.vu.nl
       Assistant: Marius Codrea, S4.16, mcodrea@few.vu.nl
    2. Course Outline
       The course is divided in two parts: theory and practice.
       1. Theory covers basic topics in neural network theory and its application to supervised and unsupervised learning.
       2. Practice deals with the basics of Matlab and the application of NN learning algorithms.
    3. Course Information
       • Register for the practicum: send email to mcodrea@few.vu.nl with:
         1. Subject: NN practicum
         2. Content: names, study numbers, study directions (AI, BWI, I, other)
       • Course information, plan, slides and links to on-line material are available at http://www.cs.vu.nl/~elena/nn.html
    4. Course Evaluation
       • Course value: 6 ECTS
       • Evaluation is based on the following two parts:
         – theory (weight 0.5): final exam at the end of the course, consisting of questions about the theory part (dates to be announced)
         – practicum (weight 0.5): Matlab programming assignments to be done in pairs (available during the course at http://www.few.vu.nl/~codrea/nn)
    5. What are Neural Networks?
       • Simple computational elements forming a large network
         – Emphasis on learning (pattern recognition)
         – Local computation (neurons)
       • The definition of NNs is vague
         – Often, but not always, inspired by the biological brain
    6. History
       Roots of work on NN are in:
       • Neurobiological studies (more than a century ago): How do nerves behave when stimulated by different magnitudes of electric current? Is there a minimal threshold needed for nerves to be activated? Given that no single nerve cell is long enough, how do different nerve cells communicate with each other?
       • Psychological studies: How do animals learn, forget, recognize and perform other types of tasks?
       • Psycho-physical experiments helped to understand how individual neurons and groups of neurons work.
       • McCulloch and Pitts introduced the first mathematical model of a single neuron, widely applied in subsequent work.
    7. History
       Prehistory:
       • Golgi and Ramón y Cajal study the nervous system and discover neurons (end of the 19th century)
       History (brief):
       • McCulloch and Pitts (1943): the first artificial neural network, with binary neurons
       • Hebb (1949): learning = neurons that fire together wire together
       • Minsky (1954): neural networks for reinforcement learning
       • Taylor (1956): associative memory
       • Rosenblatt (1958): perceptron, a single neuron for supervised learning
    8. History
       • Widrow and Hoff (1960): Adaline
       • Minsky and Papert (1969): limitations of single-layer perceptrons (and they erroneously claimed that the limitations hold for multi-layer perceptrons)
       Stagnation in the 70s:
       • Individual researchers continue laying foundations
       • von der Malsburg (1973): competitive learning and self-organization
       Big neural-nets boom in the 80s:
       • Grossberg: adaptive resonance theory (ART)
       • Hopfield: Hopfield network
       • Kohonen: self-organising map (SOM)
    9. History
       • Oja: neural principal component analysis (PCA)
       • Ackley, Hinton and Sejnowski: Boltzmann machine
       • Rumelhart, Hinton and Williams: backpropagation
       Diversification during the 90s:
       • Machine learning: mathematical rigor, Bayesian methods, information theory, support vector machines (now state of the art!), ...
       • Computational neurosciences: the workings of most subsystems of the brain are understood at some level; research ranges from low-level compartmental models of individual neurons to large-scale brain models
    10. Course Topics: Learning Tasks
        Supervised:
        • Data: labeled examples (input, desired output)
        • Tasks: classification, pattern recognition, regression
        • NN models: perceptron, Adaline, feed-forward NN, radial basis function, support vector machines
        Unsupervised:
        • Data: unlabeled examples (different realizations of the input)
        • Tasks: clustering, content-addressable memory
        • NN models: self-organizing maps (SOM), Hopfield networks
    11. NNs: goal and design
        – Knowledge about the learning task is given in the form of a set of examples (dataset) called training examples.
        – A NN is specified by:
          • an architecture: a set of neurons and links connecting neurons; each link has a weight,
          • a neuron model: the information processing unit of the NN,
          • a learning algorithm: used for training the NN by modifying the weights in order to solve the particular learning task correctly on the training examples.
        The aim is to obtain a NN that generalizes well, that is, that behaves correctly on new examples of the learning task.
    12. Example: ALVINN
        Autonomous driving at 70 mph on a public highway. A 30x32-pixel camera image serves as input; 4 hidden units, with 30x32 weights into each hidden unit; 30 outputs for steering.
    13. Face Recognition
        90% accurate at learning head pose and recognizing 1 of 20 faces.
    14. Handwritten digit recognition
    15. Dimensions of a Neural Network
        • network architectures
        • types of neurons
        • learning algorithms
        • applications
    16. Network architectures
        • Three different classes of network architectures:
          – single-layer feed-forward
          – multi-layer feed-forward
            (in both feed-forward classes, neurons are organized in acyclic layers)
          – recurrent
        • The architecture of a neural network is linked with the learning algorithm used to train it
    17. Single-layer feed-forward
        [Figure: an input layer of source nodes projecting onto an output layer of neurons.]
    18. Multi-layer feed-forward
        [Figure: a 3-4-2 network with an input layer, one hidden layer, and an output layer.]
    19. Recurrent network
        Recurrent network with hidden neurons: the unit-delay operator z^-1 is used to model a dynamic system.
        [Figure: input, hidden and output units, with z^-1 delay elements in the feedback loops.]
    20. The Neuron
        [Figure: input values x_1, ..., x_m with weights w_1, ..., w_m feed a summing function Σ; together with the bias b this produces the local field v, which the activation function φ(·) maps to the output y.]
    21. Input Signal and Weights
        Input signals:
        • An input may be either a raw/preprocessed signal or an image. Alternatively, some specific features can be used.
        • If specific features are used as input, their number and selection is crucial and application dependent.
        Weights:
        • Weights connect an input to a summing node and affect the summing operation.
        • The quality of a network can be seen from its weights.
        • Bias is a constant input with a certain weight.
        • Usually the weights are randomized in the beginning.
    22. The Neuron
        The neuron is the basic information processing unit of a NN. It consists of:
        1. A set of links, describing the neuron inputs, with weights w_1, w_2, ..., w_m
        2. An adder function (linear combiner) for computing the weighted sum of the inputs (real numbers): u = Σ_{j=1}^{m} w_j x_j
        3. An activation function (squashing function) for limiting the amplitude of the neuron output: y = φ(u + b)
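The adder-plus-activation computation above can be sketched in a few lines. The practicum itself uses Matlab; this is an illustrative Python sketch, and the function names are ours:

```python
import math

def neuron_output(x, w, b, phi):
    """Weighted sum u = sum_j w_j * x_j, then output y = phi(u + b)."""
    u = sum(wj * xj for wj, xj in zip(w, x))
    return phi(u + b)

# Sigmoid activation (slope parameter a = 1)
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

y = neuron_output([1.0, 0.5], [0.2, -0.4], 0.1, sigmoid)
```

With the identity as activation, the output is just the induced field u + b.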
    23. Bias of a Neuron
        • The bias b has the effect of applying an affine transformation to the weighted sum u: v = u + b
        • v is called the induced field of the neuron
        [Figure: with u = x_1 - x_2, the lines x_1 - x_2 = -1, x_1 - x_2 = 0 and x_1 - x_2 = 1 in the (x_1, x_2) plane correspond to different bias values.]
    24. Bias as extra input
        • The bias is an external parameter of the neuron. It can be modeled by adding an extra input x_0 = +1 with synaptic weight w_0 = b: v = Σ_{j=0}^{m} w_j x_j
        [Figure: as on slide 20, with the extra input x_0 = +1 and weight w_0 feeding the summing function.]
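The equivalence of the two views of the bias can be checked directly; a minimal Python sketch (our own function names, with x_0 = +1 prepended to the input):

```python
def induced_field(x, w, b):
    # v = sum_j w_j * x_j + b
    return sum(wj * xj for wj, xj in zip(w, x)) + b

def induced_field_extended(x, w, b):
    # Same field, with the bias folded in as w_0 = b on the extra input x_0 = +1
    return sum(wj * xj for wj, xj in zip([b] + w, [1] + x))
```

Both functions compute the same induced field v for any weights and bias.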
    25. Activation Function
        There are different activation functions used in different applications. The most common ones are:
        • Hard limiter: φ(v) = 1 if v ≥ 0, and 0 if v < 0
        • Piecewise linear: φ(v) = 1 if v ≥ 1/2, v if 1/2 > v ≥ -1/2, and 0 if v ≤ -1/2
        • Sigmoid: φ(v) = 1 / (1 + exp(-a v))
        • Hyperbolic tangent: φ(v) = tanh(v)
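The four activation functions above translate directly into code; a Python sketch for illustration (the course material itself uses Matlab):

```python
import math

def hard_limiter(v):
    # phi(v) = 1 if v >= 0, else 0
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    # phi(v) = 1 for v >= 1/2, v in between, 0 for v <= -1/2
    if v >= 0.5:
        return 1.0
    if v <= -0.5:
        return 0.0
    return v

def sigmoid(v, a=1.0):
    # phi(v) = 1 / (1 + exp(-a v)); a controls the slope
    return 1.0 / (1.0 + math.exp(-a * v))

def hyperbolic_tangent(v):
    return math.tanh(v)
```

All four squash large positive inputs toward an upper limit and large negative inputs toward a lower one; only the sigmoid and tanh are differentiable everywhere, which matters later for gradient-based learning.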
    26. Neuron Models
        The choice of φ determines the neuron model. Examples:
        • step function: φ(v) = a if v < c, and b if v > c
        • ramp function: φ(v) = a if v < c, b if v > d, and a + ((v - c)(b - a)/(d - c)) otherwise
        • sigmoid function: φ(v) = z + 1 / (1 + exp(-x v + y)), with z, x, y parameters
        • Gaussian function: φ(v) = (1 / (√(2π) σ)) exp(-(1/2) ((v - µ)/σ)^2)
    27. Learning Algorithms
        Depend on the network architecture:
        • Error-correcting learning (perceptron)
        • Delta rule (Adaline, backprop)
        • Competitive learning (self-organizing maps)
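As a preview of error-correcting learning, here is a minimal Python sketch of the perceptron rule w ← w + η(d - y)x on a toy AND problem. The dataset, learning rate and function names are ours, chosen only for illustration:

```python
def predict(w, x):
    # Hard-limiter output of a single neuron
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else 0

def perceptron_train(samples, epochs=20, eta=1):
    # Error-correcting rule: w <- w + eta * (d - y) * x
    w = [0] * len(samples[0][0])
    for _ in range(epochs):
        for x, d in samples:
            y = predict(w, x)
            w = [wj + eta * (d - y) * xj for wj, xj in zip(w, x)]
    return w

# Toy AND problem; x[0] = +1 is the bias input
data = [([1, 0, 0], 0), ([1, 0, 1], 0), ([1, 1, 0], 0), ([1, 1, 1], 1)]
w = perceptron_train(data)
```

Because AND is linearly separable, the rule converges to weights that classify all four examples correctly.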
    28. Applications
        • Classification:
          – image recognition
          – speech recognition
          – diagnosis
          – fraud detection
          – ...
        • Regression:
          – forecasting (prediction on the basis of past history)
          – ...
        • Pattern association:
          – retrieving an image from a corrupted one
          – ...
        • Clustering:
          – client profiles
          – disease subtypes
          – ...
    29. Overview of models
        Supervised learning:
        • Linear classifiers: perceptron, Adaline
        • Non-linear classifiers: feed-forward networks, radial basis function networks, support vector machines
        Unsupervised learning:
        • Clustering: self-organizing maps, K-means
        • Content-addressable memories, optimization: Hopfield networks
    30. Vectors: basics (slides by Hyoungjune Yi: aster@cs.umd.edu)
        • Ordered set of numbers: (1, 2, 3, 4)
        • Example: the (x, y, z) coordinates of a point in space
        • v = (x_1, x_2, ..., x_n), with norm |v| = √(Σ_{i=1}^{n} x_i^2). If |v| = 1, v is a unit vector.
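The norm and the unit-vector test can be written out directly; a small Python sketch (function names are ours):

```python
import math

def norm(v):
    # |v| = sqrt(sum_i x_i^2)
    return math.sqrt(sum(x * x for x in v))

def is_unit(v, tol=1e-9):
    # v is a unit vector when |v| = 1 (up to floating-point tolerance)
    return abs(norm(v) - 1.0) < tol
```

For example, (3, 4) has norm 5, while (0.6, 0.8) is a unit vector.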
    31. Vector Addition
        v + w = (x_1, x_2) + (y_1, y_2) = (x_1 + y_1, x_2 + y_2)
        [Figure: v, w and v + w drawn head-to-tail in the plane.]
    32. Scalar Product
        a v = a (x_1, x_2) = (a x_1, a x_2)
        [Figure: v and the scaled vector a v along the same direction.]
    33. Operations on vectors
        • sum
        • max, min, mean, sort, ...
        • pointwise: .^
    34. Inner (dot) Product
        v.w = (x_1, x_2).(y_1, y_2) = x_1 y_1 + x_2 y_2
        The inner product is a SCALAR!
        v.w = (x_1, x_2).(y_1, y_2) = |v| |w| cos α
        v.w = 0 ⇔ v ⊥ w
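Both identities on this slide can be checked numerically; an illustrative Python sketch (our own function names):

```python
import math

def dot(v, w):
    # v.w = sum_i x_i * y_i  (a scalar)
    return sum(a * b for a, b in zip(v, w))

def angle(v, w):
    # From v.w = |v| |w| cos(alpha): alpha = acos(v.w / (|v| |w|))
    nv = math.sqrt(dot(v, v))
    nw = math.sqrt(dot(w, w))
    return math.acos(dot(v, w) / (nv * nw))
```

A zero inner product corresponds to an angle of π/2, i.e. perpendicular vectors.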
    35. Matrices
        A_{n×m} is an n×m array of entries a_ij.
        Sum: C_{n×m} = A_{n×m} + B_{n×m}, with c_ij = a_ij + b_ij. A and B must have the same dimensions.
    36. Matrices
        Product: C_{n×p} = A_{n×m} B_{m×p}, with c_ij = Σ_{k=1}^{m} a_ik b_kj. A and B must have compatible dimensions. In general, A_{n×n} B_{n×n} ≠ B_{n×n} A_{n×n}.
        Identity matrix: I has ones on the diagonal and zeros elsewhere; AI = IA = A.
    37. Matrices
        Transpose: C_{m×n} = (A_{n×m})^T, with c_ij = a_ji. (AB)^T = B^T A^T and (A + B)^T = A^T + B^T.
        If A^T = A, A is symmetric.
    38. Matrices
        Inverse: A_{n×n} A_{n×n}^{-1} = A_{n×n}^{-1} A_{n×n} = I. A must be square.
        For a 2×2 matrix:
        [a_11 a_12; a_21 a_22]^{-1} = (1 / (a_11 a_22 - a_12 a_21)) [a_22 -a_12; -a_21 a_11]
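The matrix operations on the last few slides can be sketched in plain Python (our own helper names; in the practicum, Matlab provides these as built-ins):

```python
def mat_mul(A, B):
    # c_ij = sum_k a_ik * b_kj; inner dimensions must agree
    assert len(A[0]) == len(B)
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    # c_ij = a_ji
    return [list(col) for col in zip(*A)]

def inv2(A):
    # Closed-form inverse of a 2x2 matrix via the determinant
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]
```

Multiplying a 2×2 matrix by its `inv2` result yields the identity, as the slide's formula promises.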
    39. 2D Translation
        [Figure: a point P translated by a vector t to P'.]
    40. 2D Translation Equation
        P = (x, y), t = (t_x, t_y)
        P' = P + t = (x + t_x, y + t_y)
    41. 2D Translation using Matrices
        With P = (x, y) and t = (t_x, t_y), in homogeneous coordinates:
        P' = (x + t_x, y + t_y) = [1 0 t_x; 0 1 t_y] · (x, y, 1)^T
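The homogeneous-coordinate form above amounts to appending a 1 to the point and multiplying by the 2×3 matrix; a small Python sketch (function name is ours):

```python
def translate(point, t):
    # P' = [1 0 tx; 0 1 ty] * (x, y, 1)^T
    x, y = point
    tx, ty = t
    T = [[1, 0, tx],
         [0, 1, ty]]
    p = (x, y, 1)
    return tuple(sum(T[i][j] * p[j] for j in range(3)) for i in range(2))
```

The matrix form gives the same answer as adding t componentwise, but lets translations compose by matrix multiplication.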
    42. Scaling
        [Figure: a point P scaled to P'.]
    43. Scaling Equation
        P = (x, y), P' = (s x, s y) = s P
        In matrix form: (s x, s y)^T = [s 0; 0 s] (x, y)^T, i.e. P' = S P.
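The scaling matrix S applies the same factor to both coordinates; a minimal Python sketch (function name is ours):

```python
def scale(point, s):
    # P' = S P, with S = [[s, 0], [0, s]]
    S = [[s, 0], [0, s]]
    x, y = point
    return (S[0][0] * x + S[0][1] * y,
            S[1][0] * x + S[1][1] * y)
```

With unequal diagonal entries, the same pattern gives non-uniform scaling along the two axes.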