Computing eigenfaces
Eigenface approach
In mathematical terms, we wish to find the principal components
of the distribution of faces, i.e. the eigenvectors of the covariance
matrix of the set of face images, treating each image as a point in a
very high-dimensional space. In our case this space is R^361, since
each image is 19×19 pixels.
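As a concrete illustration, the eigen-decomposition described above can be sketched with NumPy; note that the faces array below is random placeholder data standing in for the real 19×19 training images:

```python
import numpy as np

# Placeholder data: 100 hypothetical face images of 19x19 = 361 pixels,
# each flattened into a row vector in R^361.
rng = np.random.default_rng(0)
faces = rng.random((100, 361))

mean_face = faces.mean(axis=0)       # the average face
A = faces - mean_face                # center the data
cov = A.T @ A / len(faces)           # 361x361 covariance matrix

# eigh returns eigenvalues of a symmetric matrix in ascending order,
# so reverse both arrays to order by decreasing explained variance.
eigvals, eigvecs = np.linalg.eigh(cov)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
```

Each column of eigvecs is one eigenvector; reshaped to 19×19, it can be displayed as an eigenface.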
Eigenvectors and eigenfaces I
Visualizing an eigenface
The eigenvectors are ordered, each one accounting for a different
amount of variation among face images. These vectors can be
thought of as a set of features that characterize the variation
between face images [2].
1 Each image location contributes more or less to each
eigenvector, so we can display an eigenvector as a sort of
ghostly face, which we call an eigenface, as shown in figure 1.
Figure 1: Eigenface
Eigenvectors and eigenfaces II
2 Each eigenface deviates from uniform gray where some facial
feature differs among the set of training faces; they are a sort
of map of the variations between faces.
3 Each individual face can be represented exactly as a
linear combination of the eigenfaces. Each face can also be
approximated using only the "best" eigenfaces.
4 That is, those eigenfaces that have the largest eigenvalues and
which therefore account for the most variance within the set
of face images.
5 The best k eigenfaces span a k-dimensional subspace, or
"face space", of all possible images.
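Points 3–5 can be sketched as follows; this is a minimal NumPy illustration with random placeholder data rather than real face images:

```python
import numpy as np

# Placeholder data standing in for the training faces.
rng = np.random.default_rng(1)
faces = rng.random((100, 361))
mean_face = faces.mean(axis=0)
A = faces - mean_face

eigvals, eigvecs = np.linalg.eigh(A.T @ A / len(faces))
eigvecs = eigvecs[:, ::-1]           # columns in descending eigenvalue order

k = 20                               # keep only the "best" k eigenfaces
U = eigvecs[:, :k]                   # orthonormal basis of the face space

face = faces[0]
weights = U.T @ (face - mean_face)   # coordinates of this face in face space
approx = mean_face + U @ weights     # its rank-k approximation
```

Using all 361 eigenfaces reproduces the face exactly; truncating to the top k gives the approximation in face space.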
Results I
1 The set of eigenvalues, in descending order, is shown in figure 2.
Figure 2: Spectrum of the covariance matrix
Results II
2 Determine the smallest k (with eigenvalue λ_k) such that
(∑_{i=1}^{k} λ_i) / (∑_{j=1}^{d} λ_j) ≥ 0.9.
3 In our case we determined k = 20.
4 Visualize the first k eigenvectors v_i ∈ R^361 as 19×19 images.
5 The first few of these eigenfaces are shown in panels a)–f).
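The 90% variance criterion in step 2 can be sketched as follows (random placeholder data again; on the real face images this procedure yielded k = 20):

```python
import numpy as np

# Placeholder data standing in for the training faces.
rng = np.random.default_rng(2)
faces = rng.random((100, 361))
A = faces - faces.mean(axis=0)

# Eigenvalues of the covariance matrix, in descending order.
eigvals = np.linalg.eigvalsh(A.T @ A / len(faces))[::-1]
eigvals = np.clip(eigvals, 0.0, None)   # guard against tiny negative round-off

# Cumulative fraction of variance explained by the first k eigenfaces.
ratio = np.cumsum(eigvals) / eigvals.sum()
k = int(np.searchsorted(ratio, 0.9) + 1)  # smallest k with ratio >= 0.9
```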
Results VI
6 Randomly select 10 test images, compute their Euclidean
distances to all training images, sort (in descending order) and
plot the distances. Some of these plots are shown in panels u)–x).
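Step 6 can be sketched as follows, with random arrays standing in for the test and training images:

```python
import numpy as np

# Placeholder data: 100 "training" and 10 "test" images in R^361.
rng = np.random.default_rng(3)
train = rng.random((100, 361))
test = rng.random((10, 361))

# Pairwise Euclidean distances via broadcasting:
# dists[i, j] = ||test[i] - train[j]||.
dists = np.linalg.norm(test[:, None, :] - train[None, :, :], axis=2)

# Sort each row in descending order, ready for plotting.
dists_sorted = np.sort(dists, axis=1)[:, ::-1]
```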
Results VIII
7 Consider the same 10 test images as above; in the lower-
dimensional space, compute their Euclidean distances to all
the training images, sort the set of distances in descending
order and plot them; compare the plots. The comparison of
these plots is shown on the previous slide.
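Step 7, the same comparison carried out in the k-dimensional face space, might look like this (again with placeholder data in place of real images, and k = 20 as determined earlier):

```python
import numpy as np

# Placeholder data standing in for the training and test images.
rng = np.random.default_rng(4)
train = rng.random((100, 361))
test = rng.random((10, 361))

mean_face = train.mean(axis=0)
A = train - mean_face
_, eigvecs = np.linalg.eigh(A.T @ A / len(train))
U = eigvecs[:, ::-1][:, :20]        # top-20 eigenfaces (k = 20)

# Project both sets into the 20-dimensional face space.
train_w = (train - mean_face) @ U
test_w = (test - mean_face) @ U

# Euclidean distances between the projected images.
dists_low = np.linalg.norm(test_w[:, None, :] - train_w[None, :, :], axis=2)
```

Because U has orthonormal columns, a distance in face space can never exceed the corresponding distance in the full R^361, which is what the plot comparison illustrates.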
Notes on Results
1 Images of faces, being similar in overall configuration, will not
be randomly distributed in this huge image space R^361 and
thus can be described by a relatively low-dimensional subspace
[2].
2 The main idea of PCA (or the Karhunen-Loève expansion [1]) is
to find the vectors that best account for the distribution of
face images within the entire image space.
3 These vectors define the subspace of the face images, which is
referred to as the "face space" as mentioned in earlier slides.
4 Each such vector v_i ∈ R^361 is an eigenvector of the
covariance matrix of the original face images.
References
[1] K. Karhunen, Über lineare Methoden in der
Wahrscheinlichkeitsrechnung, Annales Academiae Scientiarum
Fennicae, Series A.1, Mathematica-Physica, 1947.
[2] M. A. Turk and A. P. Pentland, Face recognition using
eigenfaces, in Proceedings of the IEEE Computer Society
Conference on Computer Vision and Pattern Recognition
(CVPR '91), IEEE, 1991, pp. 586–591.