This document outlines topics related to computational linguistics and neural networks:
1) Machine learning concepts such as training data, models, and feedback.
2) Neural networks: how artificial neurons work and how they can be used for tasks like binary classification.
3) Neural language models such as word2vec, which represent words as vectors in a semantic space to model relationships between words.
25. Training Neural Networks
• Data flow: Training Data → Neural Network → Output → Answer
• Training steps: Initialization → Forward Propagation → Error Function → Backward Propagation
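The four training steps named on this slide can be sketched as a minimal loop. This is an illustrative example, not the slide's own code: it assumes a single linear neuron with squared error, and all names (`train`, `lr`, the toy data) are invented for the sketch.

```python
import random

random.seed(0)  # reproducible initialization

def train(data, lr=0.1, epochs=100):
    # Initialization: random weight, zero bias (an illustrative choice)
    w, b = random.uniform(-1, 1), 0.0
    for _ in range(epochs):
        for x, answer in data:
            output = w * x + b       # Forward propagation
            error = output - answer  # Error function (gradient of 0.5 * error^2)
            w -= lr * error * x      # Backward propagation: weight update
            b -= lr * error          #   and bias update
    return w, b

# Learn y = 2x from tiny training data.
w, b = train([(1, 2), (2, 4), (3, 6)])
```

After training, `w * 4 + b` is close to 8, matching the target function.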
26. Initialization
• Randomly sampling W from −N ~ N (i.e., the interval [−N, N])
[Figure: a network with inputs x and y, hidden neurons n11, n12, n21, n22, bias units b, outputs z1 and z2, and weights labeled W11,x, W11,y, W12,x, W12,y, W11,b, W12,b, W21,11, W21,12, W22,11, W22,12, W21,b, W22,b]
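The initialization step above can be sketched as uniform sampling from [−N, N]. The value of N and the 2×2 shape below are illustrative assumptions, not values from the slide.

```python
import random

# Sample each weight uniformly from [-N, N]; N = 0.5 is an arbitrary
# illustrative bound, and the 2x2 shape stands in for one weight layer
# (e.g. inputs x, y feeding hidden neurons n11, n12).
N = 0.5
rows, cols = 2, 2

W = [[random.uniform(-N, N) for _ in range(cols)] for _ in range(rows)]
b = [random.uniform(-N, N) for _ in range(rows)]  # bias weights, sampled the same way
```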
32. Distributional Semantics
• The meaning of a word can be inferred from its context.
• The meanings of dog and cat are similar.
Toy corpus: The dog run. / A cat run. / A dog sleep. / The cat sleep. / A dog bark. / The cat meows.
33. Semantic Vectors
Toy corpus: The dog run. / A cat run. / A dog sleep. / The cat sleep. / A dog bark. / The cat meows.
Co-occurrence counts:

        the   a   run   sleep   bark   meow
dog      1    2    2      2      1      0
cat      2    1    2      2      0      1
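Counts like the ones above can be collected by tallying context words around each target word. The sketch below uses sentence-level co-occurrence, which is only one of several windowing choices, so its counts need not match the slide's table exactly; the function name and corpus spelling are taken from the slide.

```python
from collections import Counter

# Toy corpus from the slide, one sentence per string.
corpus = [
    "the dog run", "a cat run", "a dog sleep",
    "the cat sleep", "a dog bark", "the cat meows",
]

def cooccurrence(target):
    # Count every word that appears in the same sentence as the target.
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        if target in words:
            counts.update(w for w in words if w != target)
    return counts

dog = cooccurrence("dog")  # dog["a"] == 2: "a dog sleep", "a dog bark"
cat = cooccurrence("cat")  # cat["the"] == 2: "the cat sleep", "the cat meows"
```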
35. Cosine Similarity
• Cosine similarity between A and B is: (A · B) / (|A| |B|)
• With dog = (a1, a2, ..., an) and cat = (b1, b2, ..., bn), the cosine similarity between dog and cat is:

  (a1b1 + a2b2 + ... + anbn) / ( √(a1² + a2² + ... + an²) · √(b1² + b2² + ... + bn²) )
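The formula above, applied to the dog and cat count vectors from the co-occurrence table (columns: the, a, run, sleep, bark, meow), can be computed directly:

```python
import math

def cosine(a, b):
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

dog = [1, 2, 2, 2, 1, 0]
cat = [2, 1, 2, 2, 0, 1]
sim = cosine(dog, cat)  # = 12 / 14 ≈ 0.857
```

The high similarity (≈ 0.857) reflects the slide's point that dog and cat occur in similar contexts.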
36. Operation of Vectors
• Woman + King − Man = Queen
[Figure: vectors for Man, Woman, King, and Queen in semantic space, with the offset King − Man shared by both word pairs]
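The analogy Woman + King − Man ≈ Queen can be demonstrated with toy vectors. The 2-D coordinates below are invented for illustration, not taken from a trained model; real embeddings (e.g. word2vec) would have hundreds of dimensions.

```python
# Toy 2-D "embeddings" chosen so the gender offset (second coordinate)
# and the royalty offset (first coordinate) are shared across pairs.
vectors = {
    "man":   [1.0, 0.0],
    "woman": [1.0, 1.0],
    "king":  [3.0, 0.0],
    "queen": [3.0, 1.0],
}

# woman + king - man, computed component-wise.
target = [w + k - m for w, k, m in
          zip(vectors["woman"], vectors["king"], vectors["man"])]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Nearest stored word to the result of the vector arithmetic.
nearest = min(vectors, key=lambda word: sq_dist(vectors[word], target))
```

With these coordinates the arithmetic lands exactly on the queen vector, so `nearest` is `"queen"`.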