Neural Networks

  1. NEURAL NETWORKS & Machine Learning
     Justin Chow, Levon Mkrtchyan, Eric Su
     Senior Project, 5/16/07
  2. What are Neural Nets?
     - A computing paradigm modeled after the structure of the brain.
     - Inspired by examination of the central nervous system and its neurons.
       - In neuroscience, the term refers to physically connected neurons in our brains.
     - It is a network because the function f(x) executed by a node is a composition of other functions, which are in turn defined as compositions of other functions.
  3. What is Machine Learning?
     - Learning: given a task to solve and a class of functions F, learning means using a set of observations to find an optimal solution that is an element of F.
     - Requires a cost function to measure how close we are to the optimal solution.
     - Learning paradigms:
       - Supervised learning
       - Unsupervised learning
       - Reinforcement learning
     - Training employs many cutting-edge mathematical theories.
     - The network has a learning algorithm, which is trained with thousands of examples.
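The definition above — search a class of functions F for the member that minimizes a cost function over a set of observations — can be sketched concretely. Below is a minimal, illustrative example of supervised learning where F is the set of lines f(x) = w*x + b, the cost is mean squared error, and the search is plain gradient descent; all names and numbers are my own, not from the slides.

```python
def cost(w, b, data):
    """Mean squared error: how far f(x) = w*x + b is from the targets."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

def learn(data, lr=0.01, steps=2000):
    """Find the element of F (a line) that minimizes the cost."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to w and b.
        dw = sum(2 * (w * x + b - y) * x for x, y in data) / n
        db = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * dw
        b -= lr * db
    return w, b

# Observations drawn from y = 2x + 1; learning should recover w≈2, b≈1.
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = learn(data)
```

In supervised terms, the (x, y) pairs are the labeled observations, and the cost function is what tells the algorithm how close it is to the optimal solution.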
  4. Relations to A.I.
     - Marvin Minsky, one of the founding fathers of A.I., built the first neural network learning machine and wrote Perceptrons, a foundational work on artificial neural networks.
     - Neural networks are one of the main methods for developing computational intelligence; they often have very strong pattern-recognition capabilities.
  5. What is currently being done?
     - IBM is funding a four-year program called "systems neurocomputing."
       - Developing neural networks to recognize patterns and avoid the "superposition catastrophe"; this research is now being used to recreate a person's ability to perceive a broken line.
     - Aston Martin, DaimlerChrysler, and other car companies are developing ANN models to detect cylinder misfires in engines.
     - Georgia Tech introduced a neural network that combines living and robotic elements.
       - It uses neural networks of cultured rodent brain cells and a robotic body.
     - Recent advances in VLSI circuits, optical computing, fuzzy logic, and protein-based computing have moved the field closer to realizing massively parallel hardware.
  6. Learning
     - Associative mapping: the network learns to produce a particular pattern on the set of output units whenever another particular pattern is applied on the set of input units.
     - Regularity detection: units learn to respond to particular properties of the input patterns.
  7. How do they work?
     - The neuron
       - Models biological neurons
       - Many inputs, one output
       - Has weights, a bias, and a threshold (activation) function
     - The network architecture
       - Three interconnected layers
       - The input layer partitions the input
       - The processing (hidden) layer analyzes it
       - The output layer ... outputs
       - The programmer uses prior knowledge to ease training
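The neuron model above — many inputs, one output, with weights, a bias, and a threshold activation — can be written in a few lines. This is an illustrative sketch; the weights and bias below are hand-picked, not from the slides, and happen to make the neuron compute logical AND.

```python
def neuron(inputs, weights, bias):
    """Fire (output 1) when the weighted sum plus bias crosses zero."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total >= 0 else 0

# A two-input neuron wired to behave like logical AND:
# it only fires when both inputs are 1.
weights = [1.0, 1.0]
bias = -1.5
out = [neuron([a, b], weights, bias) for a in (0, 1) for b in (0, 1)]
```

Choosing a different bias (e.g. -0.5) would turn the same neuron into logical OR, which is exactly the sense in which the weights and bias determine what the neuron computes.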
  8. How do they work?
     - Tissues
       - Networks are like neural tissue
       - The output of one network may be input to another
       - The hidden layer may consist of a number of such tissues
     - Training
       - Weight adjustments
       - Recognizing the key part of the input
       - It is hard to see what the network has "learned"
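"Training by weight adjustments" can be made concrete with the classic perceptron rule, one common scheme (chosen here for illustration; the slides do not name a specific algorithm): whenever the neuron's output is wrong, nudge each weight in proportion to the error and that weight's input.

```python
def step(total):
    """Threshold activation: fire when the total crosses zero."""
    return 1 if total >= 0 else 0

def train(examples, n_inputs, lr=0.1, epochs=50):
    """Perceptron learning: adjust weights whenever the output is wrong."""
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            out = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
            error = target - out          # 0 when the output is correct
            # Weight adjustment: proportional to the error and the input.
            for i, x in enumerate(inputs):
                weights[i] += lr * error * x
            bias += lr * error
    return weights, bias

# Learn logical OR from labeled examples.
examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
weights, bias = train(examples, 2)
```

Note that after training, the knowledge is spread across a handful of weight values, which illustrates the slide's point: it is hard to look at the numbers and see what the network has "learned."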
  9. How do they work?
     - Firing rule: determines whether a neuron should fire for a given input.
       - Ex: take a collection of data, some of which causes firing and some of which does not. When new data is input, the elements it has most in common with the firing data will cause firing.
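The example firing rule above can be sketched directly: compare a new binary pattern against the patterns known to cause firing and those known not to, and fire when it has more elements in common with the firing set. The patterns below are illustrative placeholders.

```python
def common_elements(a, b):
    """Count positions where two binary patterns agree."""
    return sum(1 for x, y in zip(a, b) if x == y)

def fires(new, firing_set, non_firing_set):
    """Fire when the new pattern is closer to the firing examples."""
    best_fire = max(common_elements(new, p) for p in firing_set)
    best_rest = max(common_elements(new, p) for p in non_firing_set)
    return best_fire > best_rest   # a tie could be left undecided

firing = [[1, 1, 1], [1, 1, 0]]       # patterns known to cause firing
non_firing = [[0, 0, 0], [0, 0, 1]]   # patterns known not to
```

A pattern like [1, 1, 1] shares everything with the firing set and so fires; [0, 0, 0] is closest to the non-firing set and does not.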
  10. How do they work?
     - Feed-forward architecture
       - Signals travel one way, from input to output (associates inputs with outputs)
     - Feedback architecture
       - Signals can travel both ways, with loops; the state continually changes until equilibrium is reached
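A feed-forward pass through the three-layer architecture described earlier — input layer, hidden layer, output layer, signals traveling one way — can be sketched as below. The weights and biases are arbitrary illustrative numbers, and sigmoid is one common choice of activation.

```python
import math

def sigmoid(t):
    """Smooth activation squashing any total into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-t))

def layer(inputs, weights, biases):
    """Each row of `weights` feeds one neuron in the layer."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def feed_forward(inputs, hidden_w, hidden_b, output_w, output_b):
    """Signals travel one way: input -> hidden layer -> output layer."""
    hidden = layer(inputs, hidden_w, hidden_b)   # hidden layer analyzes
    return layer(hidden, output_w, output_b)     # output layer outputs

# A 2-input, 2-hidden-neuron, 1-output network with arbitrary weights.
hidden_w = [[0.5, -0.4], [0.3, 0.8]]
hidden_b = [0.1, -0.2]
output_w = [[1.0, -1.0]]
output_b = [0.0]
out = feed_forward([1.0, 0.0], hidden_w, hidden_b, output_w, output_b)
```

A feedback (recurrent) network would instead feed outputs back into the same computation and iterate until the state stops changing; the one-way version above needs only a single pass.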
  11. Successes
     - Neural networks are strongest when:
       - The task is pattern recognition
       - The algorithm is unclear
       - No algorithm exists at all
       - Large amounts of test data are available
  12. Successes
     - 20Q
       - Based on the word game Twenty Questions
       - Learns from users
       - Correct about 80% of the time
     - Image recognition
       - Recognizing objects
       - Categorizing images
       - Rendering images searchable
  13. Successes
     - Signature analysis
       - One of the first large-scale uses in the US
       - Compares signatures with stored signatures
       - 97% accuracy, up from the old 83%
       - The old four-way classification was more difficult
     - Face recognition
       - Seeks to distinguish people
       - Takes 100-200 training pictures per person
       - Average recognition rate of over 95%
       - More training does not guarantee better recognition
  14. Current implementations
     - Instant physician
       - Developed in the 1980s: a neural network was trained on a large number of medical records. After training, it could be presented with symptoms and would then present the best diagnosis.
  15. Current implementations
     - Business
       - Marketing control of seat allocation on airplanes using a feed-forward mechanism
       - Credit models and mortgage screening boosted the profitability of HNC by 27%
     - Medicine
       - Neural networks are used to model the cardiovascular system: build a network model of a patient and compare it to the actual patient, so that medical conditions can be detected before they happen.
  16. Current implementations
     - User interfaces
       - Handwriting analysis tools; text-to-speech conversion (IBM, Babel)
     - Industrial processes
       - Controlling machinery, adjusting temperature settings, and diagnosing malfunctions in robotic factories (Alyuda Research Factory)
  17. Problems encountered
     - Applications using neural networks often have little or no data available for training on fault conditions, so fuzzy logic is used instead, based on an expert's definition of certain rules.
     - "Neural network programs sometimes become unstable when applied to larger problems."
       - The larger the problem, the more information the neural network must draw from to obtain a solution, making neural networks very problem-specific.
  18. Problems encountered
     - Larger datasets require more extensive training time to reach a predictive solution, and there is the possibility of overtraining, in which training error is low but actual testing error is high. Unknown data necessary for the solution will also cause a high error rate, sometimes by affecting the weighted values used in determining a solution.
  19. Future
     - Simple systems which have learned to recognize simple entities (e.g. walls looming, or simple commands like Go or Stop) may have neural network chips implanted to help in decision-making. The Japanese are already using fuzzy logic for this purpose.
     - Use of neural networks to label what is determined to be in pictures, for use in medical searches.
     - User-specific systems for education and entertainment based on readings taken of the user.
  20. Future
     - Integration of man and machine, such as with retinal and cochlear implants.
     - More generally, the use of neural networks in more everyday and diverse applications, such as retail and manufacturing, to help make accurate decisions.
  21. Questions?