4. Introduction
•Why?
•“The next step for Deep Learning is natural language understanding…”
•Yann LeCun (Director of Facebook AI Research, Professor at New York University)
•“I think that the most exciting areas over the next five years will be really
understanding text and videos…”
•Geoffrey Hinton (Google researcher, Professor at the University of Toronto)
•Yoshua Bengio
•Michael Jordan
[D.Manning, 2015]
15. Language Model
P(Dog | I saw a …)  // full history
P(Dog | a)          // bigram approximation
P(Dog | a) = count(a Dog) / count(a)
•Problems?
1) The cat is walking in the bedroom.
2) A dog is running in a room.
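The maximum-likelihood bigram estimate above can be sketched in a few lines of Python. The corpus, function name, and example query are illustrative assumptions, not from the slides:

```python
from collections import Counter

def bigram_prob(corpus: str, prev: str, word: str) -> float:
    """Maximum-likelihood bigram estimate: P(word | prev) = count(prev word) / count(prev)."""
    tokens = corpus.lower().split()
    bigrams = Counter(zip(tokens, tokens[1:]))   # counts of adjacent token pairs
    unigrams = Counter(tokens)                   # counts of single tokens
    if unigrams[prev] == 0:
        return 0.0                               # unseen context: probability undefined, return 0
    return bigrams[(prev, word)] / unigrams[prev]

# Toy corpus: "a" occurs 3 times, followed by "dog" twice.
corpus = "I saw a dog . I saw a cat . a dog ran"
print(bigram_prob(corpus, "a", "dog"))  # 2/3
```

Such counts generalize poorly: sentence (1) gives no evidence for sentence (2) even though the two are semantically close, which is exactly the sparsity problem the slide's examples point at.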
50. References
[1] Y. LeCun, Y. Bengio, and G. Hinton. "Deep learning." Nature, 2015.
[2] R. Socher. "Deep Learning for Natural Language Processing." Presentation, 2016.
[3] C. D. Manning. "Last Words." ACL, 2015.
[4] H. Liao, E. McDermott, and A. Senior. "Large scale deep neural network acoustic modeling with semi-supervised training data for YouTube video transcription." ASRU, 2013.
[5] M.-T. Luong, R. Socher, and C. D. Manning. "Better word representations with recursive neural networks for morphology." CoNLL, 2013.
[6] S. Bowman, C. Potts, and C. D. Manning. "Recursive neural networks can learn logical semantics." arXiv, 2014.
[7] O. Levy and Y. Goldberg. "Neural word embedding as implicit matrix factorization." NIPS, 2014.
51. References
[8] T. Mikolov et al. "Efficient estimation of word representations in vector space." arXiv, 2013.
[9] T. Mikolov et al. "Distributed representations of words and phrases and their compositionality." NIPS, 2013.
[10] J. Pennington et al. "GloVe: Global Vectors for Word Representation." EMNLP, 2014.
[11] R. Kiros et al. "Skip-thought vectors." NIPS, 2015.
[12] Y. Wu et al. "Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation." arXiv, 2016.
[13] Q. Le and T. Mikolov. "Distributed Representations of Sentences and Documents." arXiv, 2014.
[14] O. Levy and Y. Goldberg. "Dependency-based word embeddings." ACL, 2014.