This document outlines advances in deep learning and neural networks. It discusses challenges in machine learning, such as manual feature extraction, and describes neuroscience experiments showing the brain's ability to learn new tasks. Neural networks aim to mimic the brain, using techniques such as backpropagation to train multi-layer models. Breakthroughs such as layer-wise pre-training and convolutional networks made it possible to scale networks to many layers. Deep learning is now used in speech translation, image recognition, handwriting recognition, and more.
9. Neuroscience Experiment (1992)
Auditory cortex learns to see!
Roe, Anna W., et al. "Visual projections routed to the auditory pathway in ferrets: receptive fields of visual neurons in primary auditory cortex." The Journal of Neuroscience 12.9 (1992): 3651-3664.
10. Seeing With Tongue
Blind people can learn to "see" using their tongue
http://www.wicab.com/en_us/press.html
12. We want:
Automatic feature learning
Training data
Unlabeled: abundant, easy to collect!
Labeled: scarce and expensive!
13. Mimicking Brain: Neural Networks
Perceptron: one-layer NN
Parameters: weights w, unknown, learned during training
Activation function: y = f(Σᵢ wᵢ xᵢ + w₀)
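The perceptron activation above can be sketched in Python. This is a minimal illustration; the input values and weights are made up, and a step function stands in for f:

```python
import numpy as np

def perceptron(x, w, w0):
    """One-layer NN: weighted sum of the inputs plus a bias,
    passed through an activation function (a step function here)."""
    z = np.dot(w, x) + w0          # sum_i w_i * x_i + w0
    return 1.0 if z > 0 else 0.0   # step activation

# The weights w and bias w0 are the parameters found during training.
x = np.array([1.0, 0.5])
w = np.array([0.6, -0.2])
print(perceptron(x, w, w0=-0.3))   # 1.0, since 0.6 - 0.1 - 0.3 = 0.2 > 0
```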
14. Training One Layer Network
Training data: input x, output y
• Reference: Deep Learning and Neural Networks, by Kevin Duh,
http://cl.naist.jp/~kevinduh/a/deep2014/
15. Training: Gradient Descent
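Gradient descent for a one-layer network can be sketched as follows. This is a toy linear unit trained on squared error; the data and hyperparameters are purely illustrative:

```python
import numpy as np

def train_one_layer(X, y, lr, epochs):
    """Gradient descent for a one-layer linear unit: pred = X @ w + w0.
    Each step moves the parameters (w, w0) against the loss gradient."""
    n, d = X.shape
    w, w0 = np.zeros(d), 0.0
    for _ in range(epochs):
        pred = X @ w + w0
        err = pred - y                 # derivative of 0.5*(pred - y)^2
        w -= lr * (X.T @ err) / n      # gradient step for the weights
        w0 -= lr * err.mean()          # gradient step for the bias
    return w, w0

# Toy training data generated from y = 2*x1 - x2 + 0.5
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = 2 * X[:, 0] - X[:, 1] + 0.5
w, w0 = train_one_layer(X, y, lr=0.5, epochs=2000)
print(np.round(w, 2), round(w0, 2))   # approx. [2. -1.] and 0.5
```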
16. Two-Layer Network
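A two-layer network, and the backpropagation updates that train it, can be sketched like this. Sigmoid activations and the XOR toy problem are illustrative choices, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two-layer network: x -> hidden layer (sigmoid) -> output (sigmoid).
# XOR is a classic task a one-layer network cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def loss():
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    return float(((out - y) ** 2).mean())

before = loss()
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                  # forward pass, layer 1
    out = sigmoid(h @ W2 + b2)                # forward pass, layer 2
    d_out = 2 * (out - y) * out * (1 - out)   # backprop through output sigmoid
    d_h = (d_out @ W2.T) * h * (1 - h)        # chain gradient back to layer 1
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)
print(round(before, 3), "->", round(loss(), 3))   # loss shrinks as training proceeds
```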
19. Multi-Layer Network: Dark Ages
Deep networks were hard to train: gradients vanish as they are backpropagated through many layers
20. Breakthrough [Hinton et al., 2006]
Layer-wise pre-training, unsupervised
Each layer optimizes the likelihood of the data, P(x)
21. Breakthrough, cont.
Then fine-tune the whole network using labeled data, supervised
23. Deep Learning Approaches
Stacked Autoencoders
Autoencoder: learns to reconstruct its input data
Easier to train
• Reference: Deep Learning tutorial, Andrew Ng
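The reconstruction idea can be sketched with a tiny linear autoencoder; the data, dimensions, and hyperparameters are illustrative, and real stacked autoencoders use nonlinear layers trained one at a time:

```python
import numpy as np

rng = np.random.default_rng(1)

# Autoencoder sketch: encode x into a smaller code, then decode it back.
# Training minimizes reconstruction error -- no labels needed, which is
# why autoencoders suit the unsupervised pre-training step.
X = rng.normal(size=(200, 4))
X[:, 2] = X[:, 0] + X[:, 1]   # redundant features: the data is
X[:, 3] = X[:, 0] - X[:, 1]   # compressible to a 2-dim code

W_enc = rng.normal(scale=0.1, size=(4, 2))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(2, 4))   # decoder weights

def recon_error():
    return float(((X @ W_enc @ W_dec - X) ** 2).mean())

before = recon_error()
for _ in range(500):
    code = X @ W_enc                  # encode
    err = code @ W_dec - X            # decode and compare to the input
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= 0.01 * grad_dec
    W_enc -= 0.01 * grad_enc
print(round(before, 3), "->", round(recon_error(), 3))   # error drops
```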
25. Extracted Features
"Convolutional Deep Belief Networks for Scalable Unsupervised Learning of Hierarchical Representations" [Lee et al., 2009]
26. Deep Learning: advances
Microsoft real-time speech translation
https://www.youtube.com/watch?v=NhxCg2PA3ZI
27. Deep Learning: advances
Google's artificial brain learns to find cats and faces
NN with 1 billion connections, trained on 16,000 computers, browsing YouTube for 3 days
28. Others
Google+ Image Search: search images without tags
Handwriting recognition
Android speech to text
Medical Diagnosis
29. Summary