(FG’04)
Face and Facial Point Tracking Facial Expression Classification
πk = Σ_j φ( d( q(sk), hj ), σp )
πk: Parzen density estimate
sk: particle
{hj}: collection of the training data
q(.): transformation function for registration
d(.): distance function
φ(.): Parzen kernel (Gaussian kernel with standard deviation σp)
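The density above can be sketched in a few lines. A minimal sketch, assuming a Euclidean distance for d(.) and an identity registration for q(.) — both are placeholders for the actual functions named above:

```python
import numpy as np

def parzen_weight(s_k, training_data, sigma_p, q=lambda x: x):
    """Parzen density estimate pi_k for a particle s_k.

    q       : registration function (identity here, an assumption)
    d       : Euclidean distance (an assumption)
    kernel  : Gaussian with standard deviation sigma_p
    """
    # d(q(s_k), h_j) for every training sample h_j
    diffs = np.linalg.norm(q(s_k) - np.asarray(training_data), axis=1)
    # sum of Gaussian (Parzen) kernel responses over the training set
    return np.sum(np.exp(-diffs**2 / (2.0 * sigma_p**2)))
```

In a particle-filter setting, a particle close to the training exemplars receives a larger weight than one far away.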
(2012)
is the part of the face occluded by the scarf. Fig. 2 (c) plots the distribution of ∆φ in P2.

Fig. 2. (a)-(b) An image pair used in our experiments. (c) The distribution of ∆φ (in radians) for the part of the face occluded by the scarf.

Before proceeding to compute the value of s in P2, we need the following theorem [22] (please refer to Section I of the supplementary material for a proof).

Theorem I. Let u(.) be a random process with u(t) ∼ U[0, 2π); then E[ ∫_X cos u(t) dt ] = 0 for any non-empty interval X ⊂ R.
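Theorem I can be sanity-checked numerically. A minimal Monte Carlo sketch, assuming u(t) is sampled i.i.d. uniform on [0, 2π) and the integral over a unit interval is approximated by a Riemann sum:

```python
import numpy as np

rng = np.random.default_rng(0)

def integral_cos_u(n_steps=1000, interval_length=1.0):
    """Approximate the integral of cos(u(t)) over X with u(t) ~ U[0, 2*pi),
    using a Riemann sum with n_steps samples."""
    u = rng.uniform(0.0, 2.0 * np.pi, size=n_steps)
    return np.sum(np.cos(u)) * (interval_length / n_steps)

# Average many independent realizations: the mean should be close to 0,
# consistent with E[ integral of cos u(t) dt ] = 0.
vals = [integral_cos_u() for _ in range(2000)]
mean = np.mean(vals)
```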
Fig. 3. Cosine-based correlation [...] for the case of synthetic occlusion [...] of images considered for our experiments [...] and expected values of s/N(P) as [...] percentage of occlusion N(P2)/N(P) (vertical axis: Correlation).
(2013)
✤ Annotate a set of facial landmarks in training images (set T)
✤ Learn a Shape Model that represents the variations in T
✤ Learn an Appearance Model from image textures defined by T
✤ Both the Shape Model and the Appearance Model are learned via PCA
✤ AAM Fitting: place the model in the image, measure similarity, update the model, and iterate
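The Shape Model step can be sketched with PCA over flattened landmark coordinates. A minimal sketch, assuming the training set T is stored as an (N, 2L) array of flattened (x, y) landmarks; `learn_shape_model` and `reconstruct` are illustrative names, not from any AAM library:

```python
import numpy as np

def learn_shape_model(shapes, n_components=2):
    """Learn a PCA Shape Model from a set T of annotated landmark shapes.

    shapes : (N, 2L) array, each row a flattened set of L (x, y) landmarks
    Returns the mean shape and the top principal modes of variation.
    """
    mean_shape = shapes.mean(axis=0)
    centered = shapes - mean_shape
    # PCA via SVD of the centered data: rows of vt are the principal modes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean_shape, vt[:n_components]

def reconstruct(mean_shape, modes, params):
    """Generate a shape from model parameters: mean + sum_i b_i * mode_i."""
    return mean_shape + params @ modes
```

The Appearance Model is built the same way, with rows of shape-normalized texture samples in place of landmark coordinates.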
(2014)
✤ Convolve HoG features with learned HoG filters
✤ Training: learn a weak learner U for each landmark point
✤ Fitting: iteratively apply and update the set of weak learners U
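The fitting stage above can be sketched as a cascaded regression loop. A minimal sketch in which each weak learner U is assumed to be a linear regressor (W, b), and `extract_features` is a hypothetical stand-in for the HoG feature extraction at the current landmark estimate:

```python
import numpy as np

def fit_landmarks(x0, weak_learners, extract_features):
    """Cascaded fitting: each weak learner maps current features to a shape update.

    x0               : initial landmark estimate (flattened)
    weak_learners    : list of (W, b) linear regressors, one per cascade stage
    extract_features : callable returning a feature vector at the current shape
                       (stand-in for HoG features around each landmark)
    """
    x = np.array(x0, dtype=float)
    for W, b in weak_learners:
        phi = extract_features(x)   # features at the current estimate
        x = x + W @ phi + b         # additive update predicted by this stage
    return x
```

With features that depend on the current estimate, each stage corrects the residual error left by the previous one, which is why a cascade of weak learners converges toward the true landmarks.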
(2014)
Results for Multi-PIE Results for LFPW
✤ C/CUDA implementation: 30 fps (ibug.doc.ic.ac.uk/resources)