Mandeepkaur Sandhu and Ajay Kumar Dogra / International Journal of Engineering Research
and Applications (IJERA) ISSN: 2248-9622 www.ijera.com
Vol. 2, Issue 4, July-August 2012, pp. 1322-1328
Metamorphism using Warping and Vectorization Method for
Image: Review
Mandeepkaur Sandhu and Ajay Kumar Dogra
Department of Computer Science Engineering, Beant College of Engineering and Management, Gurdaspur
Abstract:
Face recognition presents a challenging problem in the field of image analysis and computer vision, and as such has received a great deal of attention over the last few years because of its many applications in various domains. In this paper, an overview of some of the well-known methods is given. The applications of this technology and some of the difficulties plaguing current systems with regard to this task have also been provided. This paper also mentions some of the most recent algorithms developed for this purpose and attempts to give an idea of the state of the art of face recognition technology.

Keywords: Face recognition, biometric, algorithms

INTRODUCTION:
Interest in face recognition is moving toward uncontrolled or moderately controlled environments, in that either the probe or gallery images or both are assumed to be acquired under uncontrolled conditions. Also of interest are more robust similarity measures or, in general, techniques to determine whether two facial images correctly match, i.e., whether they belong to the same person. An important real-life application of interest is automated surveillance, where the objective is to recognize and track people who are on a watch list. In this open-world application the system is tasked to recognize a small set of people while rejecting everyone else. Traditionally, algorithms have been developed for closed-world applications. The assumption is that the probe image and its closest image in the gallery belong to the same person. The information obtained by this search is then used to do recognition.
Currently, the technology is used by police, forensic scientists, governments, private companies, the military, and casinos. The police use facial recognition for identification of criminals. Companies use it for securing access to restricted areas. Casinos use facial recognition to eliminate cheaters and dishonest money counters. Finally, in the United States, nearly half of the states use computerized identity verification, while the National Center for Missing and Exploited Children uses the technique to find missing children on the Internet. In Mexico, a voter database was compiled to prevent vote fraud. Facial recognition technology can be used in a number of other places, such as airports, government buildings, and ATMs (automatic teller machines), and to secure computers and mobile phones.
Computerized facial recognition is based on capturing an image of a face, extracting features, comparing it to images in a database, and identifying matches. As the computer cannot see the same way as a human eye can, it needs to convert images into numbers representing the various features of a face. The sets of numbers representing one face are compared with the numbers representing another face. The quality of the computer recognition system depends on the quality of the image and the mathematical algorithms used to convert a picture into numbers. Important factors for image quality are light, background, and position of the head. Pictures can be taken of still or moving subjects. Still subjects are photographed, for example by the police (mug shots) or by specially placed security cameras (access control). However, the most challenging application is the ability to use images captured by surveillance cameras (shopping malls, train stations, ATMs) or closed-circuit television (CCTV). In many cases the subject in those images is moving fast, and the light and the position of the head are not optimal.

Face Recognition Challenges
– Physical appearance
– Acquisition geometry
– Imaging conditions
– Compression artifacts

1. Face Detection
The face detection task is to identify and locate human faces in an image regardless of their position, scale, in-plane rotation, orientation, pose (out-of-plane rotation), and illumination. It is the first step for any automatic face recognition system.

Face detection methods:

Knowledge-based: Encode human knowledge of what constitutes a typical face (usually, the relationships between facial features).

Feature invariant approaches: Aim to find structural features of a face that exist even when the pose, viewpoint, or lighting conditions vary.
Template matching methods: Several standard patterns are stored to describe the face as a whole or the facial features separately. The most direct method used for face recognition is matching between the test images and a set of training images based on measuring the correlation. The similarity is obtained by normalized cross-correlation.

Appearance-based methods: The models (or templates) are learned from a set of training images which capture the representative variability of facial appearance.

Face Recognition
In general, face recognition systems proceed by identifying the face in the scene, thus estimating and normalizing for translation, scale, and in-plane rotation. Many approaches to finding faces are based on weak models of the human face that model face shape in terms of facial texture. Once a prospective face has been localized, the approaches to face recognition divide into two categories:
– Face appearance
– Face geometry
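The correlation-based template matching described above can be sketched in a few lines. This is a minimal illustration, not code from the paper; the function names `ncc` and `match` are ours, and the images are assumed to be equal-size grayscale arrays.

```python
import numpy as np

def ncc(test_img, template):
    """Normalized cross-correlation between two equal-size grayscale
    images, as used in correlation-based template matching
    (1.0 means identical up to brightness/contrast)."""
    a = test_img.astype(float).ravel()
    b = template.astype(float).ravel()
    a -= a.mean()                      # remove brightness offset
    b -= b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def match(test_img, gallery):
    """Return the index of the training image most similar to the
    test image under the NCC score, plus that score."""
    scores = [ncc(test_img, g) for g in gallery]
    return int(np.argmax(scores)), max(scores)
```

In practice the template is usually slid over the whole image; the sketch above only scores two aligned, equal-size crops.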
Framework of Face Recognition
[Figure: face detection first locates the face region and the positions of the facial features; face recognition then uses the face region and facial features to identify the person.]
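The detect-then-identify framework of the figure can be written as a small pipeline skeleton. This is only an illustrative sketch: `FaceRegion`, `recognize`, and the `detect`/`extract`/`distance` callables are hypothetical names we introduce, not an API from the paper.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class FaceRegion:
    box: tuple       # (x, y, width, height) of the detected face
    features: dict   # e.g. {"left_eye": (x, y), ...}

def recognize(image,
              detect: Callable,     # image -> FaceRegion or None
              extract: Callable,    # (image, FaceRegion) -> feature vector
              gallery: Sequence,    # (person_id, feature vector) pairs
              distance: Callable) -> str:
    """Framework of the figure: face detection first, then
    identification of the person by nearest gallery match."""
    region = detect(image)
    if region is None:
        return "no face found"
    probe = extract(image, region)
    person, _ = min(((pid, distance(probe, vec)) for pid, vec in gallery),
                    key=lambda t: t[1])
    return person
```

Any of the detection methods above (knowledge-based, feature invariant, template matching, appearance-based) can fill the `detect` slot; the recognition algorithms of the next section fill `extract` and `distance`.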
Face Recognition: Advantages
– Photos of faces are widely used in passports and driver's licenses, where the possession authentication protocol is augmented with a photo for manual inspection purposes; there is wide public acceptance of this biometric identifier.
– Face recognition systems are the least intrusive from a biometric sampling point of view, requiring no contact, nor even the awareness of the subject.
– The biometric works, or at least works in theory, with legacy photograph databases, videotape, or other image sources.
– Face recognition can, at least in theory, be used for screening of unwanted individuals in a crowd, in real time.
– It is a fairly good biometric identifier for small-scale verification applications.

Face Recognition: Disadvantages
– A face needs to be well lit by controlled light sources in automated face authentication systems. This is only a first challenge in a long list of technical challenges that are associated with robust face authentication.
– Face is currently a poor biometric for use in a pure identification protocol.
– An obvious circumvention method is disguise.
– There is some criminal association with face identifiers, since this biometric has long been used by law enforcement agencies ('mugshots').

3. Face Recognition Algorithms
3.1 Principal Component Analysis (PCA)
PCA is derived from the Karhunen-Loève transformation. Given an s-dimensional vector representation of each face in a training set of images, Principal Component Analysis (PCA) tends to find a t-dimensional subspace whose basis vectors correspond to the maximum-variance directions in the original image space. This new subspace is normally lower dimensional (t << s). If the image elements are considered as random variables, the PCA basis vectors are defined as the eigenvectors of the scatter matrix.
Principal component analysis (PCA) is a mathematical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components. The number of principal components is less than or equal to the number of original variables. This transformation is defined in such a way that the first principal component has the largest possible variance (that is, accounts for as much of the variability in the data as possible), and each following component in turn has the highest variance possible under the constraint that it be orthogonal to (i.e., uncorrelated with) the preceding components. Principal components are guaranteed to be independent only if the data set is jointly normally distributed. PCA is sensitive to the relative scaling of the original variables. Depending on the field of application, it is also named the discrete Karhunen–Loève transform (KLT), the Hotelling transform, or proper orthogonal decomposition (POD).
PCA was invented in 1901 by Karl Pearson [6]. Now it is mostly used as a tool in exploratory data analysis and for creating predictive models. PCA can be done by eigenvalue decomposition of a data covariance (or correlation) matrix or by singular value decomposition of a data matrix, usually after mean centering (and normalizing or using Z-scores) the data matrix for each attribute [7]. The results of a PCA are usually discussed in terms of component scores, sometimes called factor scores (the transformed variable values corresponding to a particular data point), and loadings (the weight by which each standardized original variable should be multiplied to get the component score) [7].
PCA is the simplest of the true eigenvector-based multivariate analyses. Often, its operation can be thought of as revealing the internal structure of the data in a way which best describes the variance in the data. If a multivariate dataset is pictured as a set of coordinates in a high-dimensional data space (one axis per variable), PCA can supply the user with a lower-dimensional picture, a "shadow" of this object when viewed from its (in some sense) most informative viewpoint. This is done by using only the first few principal components, so that the dimensionality of the transformed data is reduced.
PCA is closely related to factor analysis. Factor analysis normally incorporates more domain-specific assumptions about the underlying structure and solves the eigenvectors of a slightly different matrix.

3.2 Independent Component Analysis (ICA)
Independent Component Analysis (ICA) minimizes both second-order and higher-order dependencies in the input data and attempts to find the basis along which the data (when projected onto it) are statistically independent. Bartlett et al. provided two architectures of ICA for the face recognition task: Architecture I, statistically independent basis images, and Architecture II, a factorial code representation. Independent component analysis (ICA) is a computational method from statistics and signal processing which is a special case of blind source separation. ICA seeks to separate a multivariate signal into additive subcomponents supposing the mutual statistical independence of the non-Gaussian source signals. The general framework of ICA was introduced in the early 1980s (Hérault and Ans 1984; Ans, Hérault and Jutten 1985; Hérault, Jutten and Ans 1985), but was most clearly stated by Pierre Comon in 1994 (Comon 1994). For a good text, see Hyvärinen, Karhunen and Oja (2001). ICA finds the independent components (aka factors, latent variables, or sources) by maximizing the statistical independence of the estimated components.
Fig. 1: The PCA method.
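The PCA procedure described above (mean centering, the scatter matrix, and its top eigenvectors as the subspace basis) can be sketched as follows. This is an illustrative sketch, not the authors' implementation; the names `pca_basis` and `project` are ours.

```python
import numpy as np

def pca_basis(X, t):
    """PCA as described above: mean-center the s-dimensional face
    vectors, form the scatter matrix, and keep the t eigenvectors
    with the largest eigenvalues as the subspace basis (t << s)."""
    X = np.asarray(X, dtype=float)        # shape (n_samples, s)
    mean = X.mean(axis=0)
    centered = X - mean
    scatter = centered.T @ centered       # s x s scatter matrix
    eigvals, eigvecs = np.linalg.eigh(scatter)   # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:t]        # top-t variance directions
    return mean, eigvecs[:, order]        # basis vectors as columns

def project(x, mean, basis):
    """Component scores: coordinates of a face in the t-dim subspace."""
    return (np.asarray(x, dtype=float) - mean) @ basis
```

Recognition then reduces to comparing the t-dimensional score vectors of a probe face and the gallery faces, e.g. by Euclidean distance.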
We may choose one of many ways to define independence, and this choice governs the form of the ICA algorithm. The two broadest definitions of independence for ICA are
1) Minimization of mutual information
2) Maximization of non-Gaussianity

The minimization-of-mutual-information (MMI) family of ICA algorithms uses measures like Kullback-Leibler divergence and maximum entropy. The non-Gaussianity family of ICA algorithms, motivated by the central limit theorem, uses kurtosis and negentropy. Typical algorithms for ICA use centering, whitening (usually with the eigenvalue decomposition), and dimensionality reduction as preprocessing steps in order to simplify and reduce the complexity of the problem for the actual iterative algorithm. Whitening and dimension reduction can be achieved with principal component analysis or singular value decomposition [2].
Independent component analysis (ICA) is a computational method for separating a multivariate signal into additive subcomponents supposing the mutual statistical independence of the non-Gaussian source signals. It is a special case of blind source separation. Whitening ensures that all dimensions are treated equally a priori before the algorithm is run. Algorithms for ICA include infomax, FastICA, and JADE, but there are many others. In general, ICA cannot identify the actual number of source signals, a uniquely correct ordering of the source signals, nor the proper scaling (including sign) of the source signals. ICA is important to blind signal separation and has many practical applications. It is closely related to (or even a special case of) the search for a factorial code of the data, i.e., a new vector-valued representation of each data vector such that it gets uniquely encoded by the resulting code vector (loss-free coding), but the code components are statistically independent.

Comparison of PCA and ICA
PCA
– Focus on uncorrelated and Gaussian components
– Second-order statistics
– Orthogonal transformation
ICA
– Focus on independent and non-Gaussian components
– Higher-order statistics
– Non-orthogonal transformation
[Figure: in blind source separation both the sources and the mixing are unknown, so some optimization function is required.]
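The ICA preprocessing described above, centering plus whitening via the eigenvalue decomposition of the covariance matrix, together with kurtosis as a non-Gaussianity measure, can be sketched as follows. This is a small illustration of the preprocessing and the measure only, not a full infomax/FastICA/JADE implementation; the function names are ours.

```python
import numpy as np

def whiten(X):
    """Centering + whitening via eigenvalue decomposition of the
    covariance matrix, the usual ICA preprocessing step: afterwards
    the dimensions are uncorrelated with unit variance."""
    X = np.asarray(X, dtype=float)
    X = X - X.mean(axis=0)                 # centering
    cov = X.T @ X / len(X)
    d, E = np.linalg.eigh(cov)             # eigenvalue decomposition
    return X @ E / np.sqrt(d)              # rescale each direction

def kurtosis(s):
    """Excess kurtosis, a classical non-Gaussianity measure
    (zero for a Gaussian signal, negative e.g. for uniform ones)."""
    s = (s - s.mean()) / s.std()
    return float((s ** 4).mean() - 3.0)
```

A non-Gaussianity-based ICA algorithm would then search, in the whitened space, for directions that extremize such a measure.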
3.3 Linear Discriminant Analysis (LDA)
Linear Discriminant Analysis (LDA) finds the vectors in the underlying space that best discriminate among classes. For all samples of all classes, the between-class scatter matrix SB and the within-class scatter matrix SW are defined. The goal is to maximize SB while minimizing SW, in other
words, to maximize the ratio det(SB)/det(SW). This ratio is maximized when the column vectors of the projection matrix are the eigenvectors of (SW^-1 × SB). Linear discriminant analysis (LDA) and the related Fisher's linear discriminant are dimensionality reduction methods used before later classification. LDA is closely linked to ANOVA (analysis of variance) and regression analysis, which also attempt to express one dependent variable as a linear combination of other features or measurements [10][3]. In the other two methods, however, the dependent variable is a numerical quantity, while for LDA it is a categorical variable (i.e. the class label). Logistic regression and probit regression are more similar to LDA, as they also explain a categorical variable. These other methods are preferable in applications where it is not reasonable to assume that the independent variables are normally distributed, which is a fundamental assumption of the LDA method. LDA is also closely related to principal component analysis (PCA) and factor analysis in that they both look for linear combinations of variables which best explain the data [3]. LDA explicitly attempts to model the difference between the classes of data. PCA, on the other hand, does not take into account any difference in class, and factor analysis builds the feature combinations based on differences rather than similarities. Discriminant analysis is also different from factor analysis in that it is not an interdependence technique: a distinction between independent variables and dependent variables (also called criterion variables) must be made. LDA works when the measurements made on independent variables for each observation are continuous quantities. When dealing with categorical independent variables, the equivalent technique is discriminant correspondence analysis [7][8].

Basic steps of the LDA algorithm [3]
LDA uses the PCA subspace as input data, i.e., the matrix V obtained from PCA. The advantage is cutting the eigenvectors in matrix V that are not important for face recognition (this significantly improves computing performance). LDA considers the between-class and also the within-class correspondence of the data. It means that the training images create a class for each subject, i.e., one class = one subject (all his/her training images).

1. Determine the LDA subspace (i.e. determining the line in Fig. 2) from the training data. Calculate the within-class scatter matrix

SW = ∑_(i=1..C) Si,  where Si = ∑_(x∈Xi) (x − mi)(x − mi)^T

where mi is the mean of the images in class i and C is the number of classes.

2. Calculate the between-class scatter matrix

SB = ∑_(i=1..C) ni (mi − m)(mi − m)^T

where ni is the number of images in class i, mi is the mean of the images in the class, and m is the mean of all the images.

3. Solve the generalized eigenvalue problem

SB V = Λ SW V

3.4 Hidden Markov Models (HMM)
Hidden Markov Models (HMM) are a set of statistical models used to characterize the statistical properties of a signal. An HMM consists of two interrelated processes:
(1) an underlying, unobservable Markov chain with a finite number of states, a state transition probability matrix, and an initial state probability distribution, and
(2) a set of probability density functions associated with each state.

A Hidden Markov Model (HMM) is a finite state machine which has some fixed number of states. It provides a probabilistic framework for modeling a time series of multivariate observations. Hidden Markov models were introduced in the beginning of the 1970s as a tool in speech recognition. This model, based on statistical methods, has become progressively popular in the last several years due to its strong mathematical structure and theoretical basis for use in a wide range of applications [1]. A hidden Markov model is a doubly stochastic process, with an underlying stochastic process that is not observable (hence the word hidden) but can be observed through another stochastic process that produces the sequence of observations. The hidden process consists of a set of states connected to each other by transitions with probabilities, while the observed process consists of a set of outputs or observations, each of which may be emitted by each state according to some probability density function (pdf). Depending on the nature of this pdf, several HMM classes can be distinguished, e.g. when the observations are naturally discrete or quantized using vector quantization [11].

3.5 Face recognition using line edge maps (LEM)
This algorithm describes a new technique based on line edge maps (LEM) to accomplish face recognition. In addition, it proposes a line matching technique to make this task possible. In contrast with other algorithms, LEM uses physiological features from human faces to solve the problem; it mainly uses the mouth, nose, and eyes as the most characteristic ones.
In order to measure the similarity of human faces, the face images are initially converted into gray-level
pictures. The images are encoded into binary edge maps using the Sobel edge detection algorithm. This system is much related to the way human beings perceive other people's faces, as stated in many psychological studies. The main advantage of line edge maps is their low sensitivity to illumination changes, because they are an intermediate-level image representation derived from a low-level edge map representation [9]. The algorithm has another important improvement: low memory requirements, because of the reduced amount of data used. In [13] there is an example of a face line edge map; it can be observed that it keeps face features but at a very abridged level.

4. Face Recognition Techniques
Taranpreet Singh Ruprah has presented a face recognition system using PCA with neural networks in the context of face verification and face recognition, using photometric normalization for comparison. The experimental results compare the neural network and Euclidean distance rules using PCA for overall verification performance. However, for recognition, the Euclidean distance classifier gives the highest accuracy using the original face image. Thus, applying histogram equalization techniques to the face image does not have much impact on the performance of the system if conducted under a controlled environment [17]. Byongjoo Oh proposes a face recognition algorithm and presents results of performance tests on the ORL database. The PCA algorithm is first tried as a reference performance. Then a PCA+MLNN algorithm was proposed and tested in the same environments. The PCA+MLNN algorithm shows better performance, and resulted in a 95.29% recognition rate. The introduction of the MLNN enhances the classification performance and provides relatively robust performance under variation of light. However, this result does not guarantee that the PCA+MLNN algorithm is always better than PCA [12]. Jan Mazanec and others' 2010 experiments with the FERET database imply that LDA+ldaSoft generally achieves the highest maximum recognition rate. In their experiments they show that LDA alone is not suitable for practical use. At certain parameter settings LDA produced the worst recognition rates from among all experiments. Experiments with their proposed methods PCA+SVM and LDA+SVM produced a better maximum recognition rate than the traditional PCA and LDA methods. The combination LDA+SVM produced more consistent results than LDA alone. Altogether they made more than 300 tests and achieved maximum recognition rates near 100% (LDA+SVM once actually 100%) [8]. Chengjun Liu and Harry Wechsler address the relative usefulness of independent component analysis for face recognition. The sensitivity analysis suggests that for enhanced performance ICA should be carried out in a compressed and whitened space where most of the characteristic information of the original data is conserved and the small trailing eigenvalues are discarded. The dimensionality of the compressed subspace is decided based on the eigenvalue spectrum from the training data. As a result, their discriminant analysis shows that the ICA criterion, when carried out in the properly compressed and whitened space, performs better than the eigenfaces and Fisherfaces methods for face recognition [4].

5. Conclusions:
Face recognition is a challenging problem in the field of image analysis and computer vision that has received a great deal of attention over the last few years because of its many applications in various domains. Research has been conducted actively in this area for the past four decades or so, and though huge progress has been made and encouraging results have been obtained, and current face recognition systems have reached a certain degree of maturity when operating under constrained conditions, they are far from achieving the ideal of being able to perform adequately in all the various situations that are commonly encountered by applications utilizing these techniques in practical life.

6. References:
[1] A. El-Yacoubi, M. Gilloux, R. Sabourin, and C. Y. Suen, An HMM-Based Approach for Off-Line Unconstrained Handwritten Word Modeling and Recognition, IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(8), 1999.
[2] A. Hyvärinen, J. Karhunen and E. Oja, Independent Component Analysis, Signal Processing, 36(3), 2001, 287–314.
[3] B. Kamgar-Parsi and B. Kamgar-Parsi, Methods of Facial Recognition, US Patent 7,684,595, 2010.
[4] C. Liu and H. Wechsler, Comparative Assessment of Independent Component Analysis (ICA) for Face Recognition, Appears in the Second