International Journal of Electrical and Computer Engineering (IJECE)
Vol. 10, No. 3, June 2020, pp. 3307~3314
ISSN: 2088-8708, DOI: 10.11591/ijece.v10i3.pp3307-3314
Journal homepage: http://ijece.iaescore.com/index.php/IJECE
Analysis of techniques used to recognize and identify human emotions
Praveen Kulkarni¹, Rajesh T M²
¹ Research Scholar, Dayananda Sagar University, India
¹ Department of Computer Science and Engineering, Faculty of Engineering, CHRIST (Deemed to be University), India
² Department of Computer Science and Engineering, Dayananda Sagar University, India
Article history: Received Jul 26, 2019; Revised Dec 17, 2019; Accepted Jan 8, 2020

ABSTRACT
Facial expression is a major channel of non-verbal communication in day-to-day life. Statistical analysis shows that only 7 percent of a message is conveyed through verbal communication, while 55 percent is transmitted by facial expression. Emotional expression has been a research subject of psychology since Darwin's work on emotional expression in the 19th century. According to psychological theory, human emotion is classified into six major emotions: happiness, fear, anger, surprise, disgust, and sadness. Facial expressions, which convey these emotions, and the nature of speech play a foremost role in expressing them. Researchers later developed a system based on the anatomy of the face, named the Facial Action Coding System (FACS), in the 1970s. Ever since the development of FACS there has been rapid progress in the domain of emotion recognition. This work is intended to give a thorough comparative analysis of the various techniques and methods that have been applied to recognize and identify human emotions. The results of this analysis will help to identify suitable techniques, algorithms, and methodologies for future research directions. In this paper, an extensive analysis of the various recognition techniques used to address the complexity of recognizing facial expressions is presented.
Keywords: Classification, Face detection, Feature extraction
Copyright © 2020 Institute of Advanced Engineering and Science.
All rights reserved.
Corresponding Author:
Praveen Kulkarni,
Department of Computer Science and Engineering, Faculty of Engineering,
CHRIST (Deemed to be University),
Bangalore, India.
Email: praveencs024@gmail.com
1. INTRODUCTION
Emotions play a very important role in many fields, such as forensic crime detection, the care of psychologically affected patients, student mentoring in academics, and victim observation in court. However, despite the growth in this technological domain, our survey still finds many drawbacks and loopholes in terms of accuracy in the reported results. The goal of facial expression recognition is to identify, by observing a single image or multiple images, which emotion the image shows. The human face has several components such as the eyes, nose, mouth, and brows; based on the movement of these components and changes in their shape and size, emotions can be extracted in various ways. Recognition of human emotion has become one of the greatest challenges in human-computer interaction. Most of the effort on emotion recognition focuses on information extracted from visual or audio channels separately. Numerous surveys have been published in different areas analyzing and recognizing gestures and movements of the human face, yet eye tracking is still an unsolved problem with respect to accuracy. Much of the earlier work in this area serves as a benchmark and foundation for current research topics. The researcher in [1]
has dedicated his career to the study of emotions, focusing mainly on six emotions: surprise, sadness, fear, anger, disgust, and happiness. However, in the 1990s, he expanded his list of basic emotions to include a variety of positive and negative emotions such as amusement, dislike, awkwardness, relief, pleasure, satisfaction, guilt, and pride.
Understanding facial expression may seem a common phenomenon, but it has become the most essential indicator in the history of emotional psychology. More recently, research on facial expressions has led to the emergence of new concepts, new techniques, and innovative results; scientists are formulating alternative accounts while earlier ones are attracting new interest. Darwin's arguments suggest that emotions evolved to act as an instrument of communication between individuals. In 2003, Gao et al. used dLHD, and Noh et al. (2007) used the ID3 decision tree classification method. Zhao and Pietikainen (2009) and Song et al. (2010) used SVM for classification. The work in [2] reports that the SVM classifier gives the highest accuracy for several facial expressions. Learning Vector Quantization (LVQ) (Bashyal, 2008) and MLP have also been used for classification [3]. The KNN algorithm (Poursaberi, 2012) is a classification method in which the relationships among the assessment models are estimated during training. The HMM classifier is one of the statistical representations used to categorize expressions into various types in face detection (Taylor, 2014). The classification and regression tree (Salman, 2016) is a machine learning algorithm used for classification based on distance vectors. CNN (2016), MFFNN, DNN (2015), and MDC (Islam, 2018) are some of the classifiers used recently. Compared with many other classifiers, SVM gives improved recognition accuracy and better classification, and it is currently the most widely used classifier. Other classifiers such as CART, pairwise, and neural-network-based methods have been used in recent years compared to past decades.
Emotion recognition systems find applications in several fascinating areas. With the recent advances in artificial intelligence, particularly in robotics, the requirement for a strong expression recognition system is obvious. The recognition of emotions plays a very important role in recognizing a person's affective state and, in turn, helps to build a sensitive human-computer interface (HCI). In addition to these two main applications, namely artificial intelligence and affective human-machine interfaces, emotion recognition systems are used in a large variety of other domains such as telecommunications, video gaming, video animation, psychiatry, automotive security, educational computer systems, and many others. Many algorithms have been suggested to develop systems and applications that can detect emotions reliably. Computer applications could communicate better by changing their responses based on the emotional state of human users in various interactions.
Emotion detection using the facial components described above is still in use. It is generally performed stage by stage: preprocessing is the first stage, followed by feature extraction and then classification. Over the years there has been considerable growth and advancement in these three stages with a wide variety of algorithms, and in current work a larger number of features are extracted from faces to verify emotions. The facial emotion recognition field has been gaining a great deal of attention over the past two decades, with applications and innovation not only in the perceptual and cognitive sciences but also in affective computing and computer animation. A more recent study suggests that there are far more basic emotions than previously believed. In a study published in the Proceedings of the National Academy of Sciences, researchers identified 27 different categories of emotion; instead of being entirely distinct, however, the researchers found that individuals experience these emotions on a gradient. Recognition of facial emotions and expressions has also improved recently because of the fast development of machine learning and artificial intelligence (AI) techniques, together with human-computer interaction, virtual reality (VR), and augmented reality (AR). Facial expression detection is also found in advanced driver assistance systems and entertainment.
Although the technology for recognizing emotions is important and has evolved in different fields, it is still an unsolved problem. Human emotions can be detected through facial images, voice, and body posture. Among these, the facial image is the most frequent source, and frontal facial images are most commonly used to detect emotions. The procedure for recognizing emotions is not simple: extracting the appropriate characteristics and detecting emotions requires complex steps. Facial Expression Recognition (FER) consists of five phases, as shown in Figure 1. Noise elimination and enhancement are performed in the pre-processing phase, which takes an image or a sequence of images (a time series of images from neutral to an expression) as input for further processing. Facial component detection locates regions of interest (ROI) such as the eyes, nose, cheeks, mouth, forehead, and ears. The feature extraction phase concerns the extraction of characteristics from the ROI. Here, we discuss work not included in previous surveys, covering what has changed over the last two decades in emotion detection. The following sections compare previously published work with the research currently being carried out.
Figure 1. Phases of facial expression recognition
2. RESEARCH METHOD
Pre-processing is the first step in detecting a face. To obtain purely facial images with good intensity, proper size, and an identical normalized shape, pre-processing must be performed. Converting an image into a clean, normalized face image for feature extraction involves detecting characteristic points, rotating the image to align it, and identifying and cropping the facial region with a bounding rectangle according to a face model. Face detection methods for a single image can be categorized into four groups: feature-based, appearance-based, knowledge-based, and template-based, as shown in Figure 2; a minimal face-detection sketch follows the figure.
Figure 2. Face detection methods
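As an illustration of the detection stage discussed above, the following is a minimal sketch using OpenCV's bundled pre-trained Haar cascade. The cascade file name, parameters, and input file are common defaults chosen for illustration, not the detector used in any particular surveyed paper.

```python
# Minimal sketch: frontal face detection with a pre-trained Haar cascade (OpenCV).
# Illustrative only; the surveyed papers use a variety of detectors.
import cv2

def detect_faces(image_path):
    # Load OpenCV's bundled frontal-face cascade
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)          # simple illumination normalization
    # Returns a list of (x, y, w, h) face rectangles
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                    minSize=(48, 48))

if __name__ == "__main__":
    for (x, y, w, h) in detect_faces("face.jpg"):   # hypothetical input file
        print(f"face at x={x}, y={y}, w={w}, h={h}")
```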
2.1. Feature based
This technique finds faces by extracting structural features of the face. A classifier is first trained to differentiate between facial and non-facial regions; the idea is to use our own knowledge of faces to overcome the limitations of purely pixel-based approaches. The approach is divided into several steps, and even on photos with several faces a success rate of 94% is reported.
2.2. Texture-based
The emotion is recognized from texture features extracted from the gray-level co-occurrence matrix (GLCM); the GLCM features are classified with support vector machines using diverse kernels. Statistical features taken from the GLCM are given as input to the SVM for classification, and the detection rate is reported as 90% [4]. The algorithm in [5] detects the characteristic points using a spatial filtering technique, and grouping indicates the facial candidates. The classifier is first trained to distinguish facial from non-facial regions. This technique reports an 85% detection rate on one hundred images with different scales and orientations.
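A hedged sketch of this texture-based pipeline follows: GLCM statistics fed to an SVM, in the spirit of [4]. The kernel, distances, and angles are assumptions, and the dataset and labels are hypothetical.

```python
# Sketch of texture-based recognition: GLCM statistics fed to an SVM.
# Function names and parameters are illustrative assumptions.
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # skimage >= 0.19 names
from sklearn.svm import SVC

def glcm_features(gray_face):
    # gray_face: 2-D uint8 array (a cropped face region)
    glcm = graycomatrix(gray_face, distances=[1], angles=[0, np.pi/4, np.pi/2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def train_texture_svm(face_crops, labels):
    # face_crops: list of grayscale face crops; labels: emotion labels (assumed)
    X = np.array([glcm_features(f) for f in face_crops])
    clf = SVC(kernel="rbf", C=10.0, gamma="scale")  # one of the "diverse kernels"
    return clf.fit(X, labels)
```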
2.3. Skin color based
As described by the author in [6], algorithms are compared using three color spaces, and this combination of colors leads to a face detection algorithm with color replacement. Once the skin region is obtained, the eye regions are eliminated by obscuring the background colors, and the image is converted to grayscale and then to a binary image with a suitable threshold. The detection rate is reported as 95.18%.
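The sketch below shows a typical skin-color segmentation step of this kind; the YCrCb thresholds are a common heuristic assumed for illustration, not the exact values used in [6].

```python
# Illustrative skin-color segmentation sketch (heuristic YCrCb bounds).
import cv2
import numpy as np

def skin_mask(bgr_image):
    ycrcb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)     # assumed heuristic bounds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Clean small speckles so the remaining blobs are face candidates
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask
```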
2.4. Multiple feature
AdaBoost selects multiple features from the face region [7]. The face is divided into different regions using orthogonal components, and parts such as the nose, eyes, and mouth are identified for analysis. The combined regions are given as input to the classifier, which chooses the best arrangement in each round and then updates the weights for the next round. The reported detection rate on many subjects is higher than that of the conventional orthogonal principal component analysis method.
2.5. Template matching methods
When an image is given as input, this method matches it against a predefined template to identify the face. For example, the face can be divided into several parts, such as the eyes, nose, and mouth, and a model can be designed that is helpful for edge detection. This approach is easy to implement; however, it is not sufficient on its own for face detection, so the shape of the templates has to be planned and managed carefully.
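A minimal template-matching sketch with OpenCV is shown below; the template image and the acceptance threshold are assumptions made for illustration.

```python
# Minimal template-matching sketch; threshold and template are hypothetical.
import cv2

def match_face_template(gray_image, gray_template, threshold=0.7):
    result = cv2.matchTemplate(gray_image, gray_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val >= threshold:
        h, w = gray_template.shape
        x, y = max_loc
        return (x, y, w, h)   # best-matching face location
    return None               # no sufficiently similar region found
```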
2.6. Appearance-based methods
These methods are designed mainly for face detection. The appearance-based technique relies on a set of training face images to learn face models, and this type of approach generally performs best compared with the alternatives. Appearance-based techniques typically rely on statistical analysis and machine learning to find the relevant characteristics of face images, and they are also employed in feature extraction for face recognition.
3. RESULTS AND DISCUSSIONS
The effect of lighting should be eliminated and the image scaled to a fixed size [8]. To identify the face in an image automatically, facial points must be detected. The algorithmic scheme proposed in [9], together with the technique in [10], is commonly used for face detection. The algorithm has been tested on the Cohn-Kanade database as well as on the technology and mathematical modeling database [11], detecting fourteen facial expression points with a median precision of 86% on Cohn-Kanade and 83% on the mathematical modeling database. The Eigenface-based algorithm of Yogesh Tayal et al. [12] applies to a mixture of images taken with different illuminations, backgrounds, and image sizes, and the time needed for the algorithm is 4.5456 seconds. The Euclidean weights and distances of the input image are taken and compared against the database to match the characteristics and recognize the face; a worked Eigenface sketch is given below.
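The following is a hedged Eigenface sketch in the spirit of [12]: PCA projection followed by Euclidean nearest-neighbour matching. The number of components and the data loading are assumptions.

```python
# Hedged Eigenface sketch: PCA projection plus Euclidean nearest-neighbour match.
import numpy as np
from sklearn.decomposition import PCA

class EigenfaceMatcher:
    def __init__(self, n_components=50):       # component count is an assumption
        self.pca = PCA(n_components=n_components, whiten=True)

    def fit(self, face_vectors, labels):
        # face_vectors: (n_samples, n_pixels) flattened grayscale faces
        self.train_proj = self.pca.fit_transform(face_vectors)
        self.labels = np.asarray(labels)
        return self

    def predict(self, face_vector):
        proj = self.pca.transform(face_vector.reshape(1, -1))
        dists = np.linalg.norm(self.train_proj - proj, axis=1)  # Euclidean distance
        return self.labels[np.argmin(dists)]
```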
Knowledge-based methods mainly depend on predefined rules that encode human knowledge for finding a face; for example, a face has a nose and a mouth at a bounded distance from each other. The major drawback of this strategy is the difficulty of constructing an appropriate set of rules: rules that are too general or too detailed lead to many false positives, and this type of approach alone is not capable of finding faces in many different images. The skin color and shape of the face are adapted to the window size, and the color signature is used to calculate the color distance [13]. A face normally comprises a nose, eyes, and mouth at definite distances and positions relative to each other, and the main problem with the method arises when the rules have to be defined. This tactic reached 93.4% detection with false positives up to 7.1%.
3.1. Template matching
The author in [14] describes an approach based on local statistical features and Local Binary Patterns (LBP) for person-independent expression recognition. The recognition rate is 86.97% on the MMI database and 85.6% on the JAFFE database.
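An illustrative LBP feature sketch of the kind used in [14] is given below; the grid size and LBP parameters are assumptions rather than the exact settings of that work.

```python
# Illustrative LBP histogram features for expression recognition.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray_face, P=8, R=1, grid=(7, 7)):
    lbp = local_binary_pattern(gray_face, P, R, method="uniform")
    n_bins = P + 2                      # uniform patterns + one "non-uniform" bin
    gh, gw = grid
    h, w = lbp.shape
    feats = []
    for i in range(gh):                 # concatenate per-cell histograms
        for j in range(gw):
            cell = lbp[i*h//gh:(i+1)*h//gh, j*w//gw:(j+1)*w//gw]
            hist, _ = np.histogram(cell, bins=n_bins, range=(0, n_bins),
                                   density=True)
            feats.append(hist)
    return np.hstack(feats)
```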
3.2. Active shape model
In [15], the image sequence is processed with a wire-frame model and an active appearance tracking algorithm, and a support vector machine (SVM) is used for classification. The model deforms the shape toward the final frame as the expression changes.
3.3. Distribution features
In [16], five significant parts are cropped from each image for feature extraction, and the specific eigenvectors for each expression are stored. The eigenvectors of an input facial image are then calculated, and SVM is used for classification.
3.4. Feature extraction
Feature extraction is a key stage in face recognition and involves several steps, such as dimensionality reduction, extraction of facial appearance, and selection of characteristics. Dimensionality reduction is a significant task in a recognition system. Figure 3 shows different feature extraction techniques with their recognition rates. Discrete Cosine Transform: the DCT is applied to the image and features such as the mouth and eyes are extracted [17]; the classifier used is AdaBoost, with a recognition rate of 75.93%. A DCT feature sketch is given after Figure 3.
Figure 3. Feature extraction and recognition rate
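The following hedged sketch shows DCT-based feature extraction of the kind used in [17]: the low-frequency 2-D DCT coefficients of a normalized face crop are kept as a compact feature vector. The number of retained coefficients is an assumption.

```python
# Hedged DCT feature sketch: keep low-frequency 2-D DCT coefficients.
import numpy as np
from scipy.fft import dctn

def dct_features(gray_face, keep=8):
    # 2-D DCT of the whole (normalized) face crop
    coeffs = dctn(gray_face.astype(float), norm="ortho")
    # Low frequencies (top-left keep x keep block) carry the coarse appearance
    return coeffs[:keep, :keep].ravel()
```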
3.5. Gabor filter
Gabor filters are used to separate different expressions. The expression database used for the recognition system was JAFFE, and the recognition rate is above 93%.
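Below is an illustrative Gabor filter-bank feature sketch; the frequencies and orientations are typical choices, not necessarily those of the surveyed system.

```python
# Illustrative Gabor filter-bank features; parameters are assumptions.
import numpy as np
from skimage.filters import gabor

def gabor_features(gray_face,
                   frequencies=(0.1, 0.2, 0.3),
                   orientations=(0, np.pi/4, np.pi/2, 3*np.pi/4)):
    feats = []
    for f in frequencies:
        for theta in orientations:
            real, imag = gabor(gray_face, frequency=f, theta=theta)
            magnitude = np.sqrt(real**2 + imag**2)
            feats.extend([magnitude.mean(), magnitude.std()])  # simple statistics
    return np.array(feats)
```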
3.6. PCA
Facial features are extracted by principal component analysis; according to the author in [18], weighted principal component analysis (WPCA) is used to merge multiple features for classification. The Euclidean distance is calculated to obtain the similarity between the models, and facial expression recognition is then performed with a nearest-neighbour algorithm; the recognition rate is 88.25%. LDA (linear discriminant analysis) was proposed for calculating discriminant vectors with a two-stage procedure [19]. The vectors are combined using K-means clustering with each training sample, and 91% recognition was reported. An LDA sketch is given below, followed by the other classifiers considered in this survey:
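This is a hedged LDA sketch (scikit-learn) loosely following the two-stage idea in [19]: dimensionality reduction first to avoid the small-sample-size problem, then linear discriminant analysis. The number of PCA components is an assumption.

```python
# Hedged two-stage LDA sketch: PCA for dimensionality reduction, then LDA.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_lda_expression_classifier(face_vectors, labels):
    # PCA first mitigates the small-sample-size problem LDA suffers from;
    # 100 components is an assumed value, not taken from [19].
    clf = make_pipeline(PCA(n_components=100),
                        LinearDiscriminantAnalysis())
    clf.fit(np.asarray(face_vectors), np.asarray(labels))
    return clf
```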
3.6.1. Hidden markov model
A hidden Markov model was developed to categorize higher-level emotional states, such as involved, insecure, disagreeing, hopeful, and discouraged, starting from the lowest level. In the author's view, this emotional categorization captures the state of emotions well, so it serves as prior information and a corresponding mapping is established with the facial features. Expert rules are used for segmenting and recognizing the emotional state over a number of video sequences. The probabilistic framework models variable-length time sequences, and the combined detection is computed in real time; recognition of the insecure emotional state reaches 87%.
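A minimal HMM sketch is given below, assuming the hmmlearn package: one Gaussian HMM per emotional state trained on per-frame feature sequences, with classification by likelihood. The number of states and the feature representation are assumptions.

```python
# Minimal HMM sketch (assumes hmmlearn is installed): one GaussianHMM per
# emotion, trained on per-frame feature sequences, classify by log-likelihood.
import numpy as np
from hmmlearn import hmm

def train_emotion_hmms(sequences_by_emotion, n_states=3):
    # sequences_by_emotion: {label: list of (T_i, n_features) arrays}
    models = {}
    for label, seqs in sequences_by_emotion.items():
        X = np.vstack(seqs)
        lengths = [len(s) for s in seqs]
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                            n_iter=50)
        models[label] = m.fit(X, lengths)
    return models

def classify_sequence(models, seq):
    # Pick the emotion whose HMM assigns the highest log-likelihood
    return max(models, key=lambda label: models[label].score(seq))
```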
3.6.2. Neural networks
This approach combines two methods, feature extraction and a neural network, and has two phases: face detection and classification. The image is pre-processed to reduce processing time and improve image quality. The neural network is trained with face and non-face pictures from the Yale Face database; images in the data set are 27x18 grayscale pixels, and the reported rate is 84.4 percent.
3.6.3. Support vector machine
Each video frame is scanned in real time to detect the frontal face; the detected faces are resized into image patches of the same size and processed with a bank of Gabor energy filters. Finally, the author developed a system that outputs the codes of different
facial expressions at twenty-four frames per second and animates the computer-generated facial appearance; it also performs fully automated facial action coding. The recognition rate is 93%.
3.6.4. AdaBoost
The frontal face in a sequence of images is classified into seven categories: surprise, anger, fear, disgust, neutral, joy, and sadness. Here the proposed technique is executed without feature blocks, and skin color is detected in this work. Facial features are extracted, and the facial expressions are finally distinguished by the shift of landmark characteristics on the face, using the AdaBoost-based classifier proposed by the author. The reported accuracy rate is 90%; a hedged AdaBoost sketch is given after Table 1. As shown in Figure 4, authors report different classification approaches and accuracy rates [20]. Table 1 shows the analysis of the different methods and techniques used for emotion detection together with their accuracy.
Figure 4. Classification and accuracy rate
Table 1. Analysis of different methods and their accuracy

Method | Recognition accuracy (%) | Expressions recognized | Advantages and major contribution | Ref., year
CNN | Not reported | Not reported | Multiscale feature extractor; in-plane pose variation | [21], 2002
dLHD and LEM | 86.6 | 3 | Oriented structural features | [22], 2003
RVM (relevance vector machine) | 90.84 | - | Recognition in static images | [23], 2005
Multi-stream HMMs | - | - | Recognition error reduced to 44% compared to single stream | [24], 2006
ID3 decision tree | 75 | 6 | Cost effective with respect to accuracy and speed | [25], 2007
LVQ and GF | 88.86 | Not reported | Efficient emotion detection | [26], 2008
SVM and GASM | 93.85 | 6 | Learning using AdaBoost; flexible selection | [27], 2009
5 parallel Bayesian classifiers | - | - | Multi-classification achieved; static and real-time video | [28], 2010
SVM | 82.5 | 6 | Based on distance features; effective recognition, highest CRR | [29], 2011
LBP, SVM | 95.84 | 6 | Image-based recognition; spatial-temporal features | [30], 2012
Lines of connectivity | 93.8 | 3 | Triangular face using LC and geometric approach | [31], 2013
GF, MFFNN | 94.16 | 7 | Very low computational cost | [32], 2014
SVM and DCT | 98.63 | 6 | Fast and high accuracy | [33], 2015
OSELM-SC | 95.17 | 6 | Online sequential extreme learning machine; good recognition accuracy and robustness | [34], 2017
DCT, GF, SVM | More than 90 | 8 | Good detection rate under various light sources | [35], 2018
CNN, SVM | - | - | Better classification compared to state-of-the-art CNN; deep learning features | [36], 2019
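As a companion to the AdaBoost discussion above, the following hedged sketch uses scikit-learn's AdaBoostClassifier (with its default decision-stump base learner) for seven-class expression classification; the feature vectors and training data are assumed, and this is not the exact classifier of any surveyed paper.

```python
# Hedged AdaBoost sketch for seven-class expression classification; a stand-in
# for the boosted classifier described above, whose exact features are unknown.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

EMOTIONS = ["surprise", "anger", "fear", "disgust", "neutral", "joy", "sadness"]

def train_adaboost(feature_vectors, labels):
    # feature_vectors: (n_samples, n_features) facial features, e.g. landmark
    # displacements; labels: strings drawn from EMOTIONS (assumed available)
    clf = AdaBoostClassifier(n_estimators=200)
    return clf.fit(np.asarray(feature_vectors), np.asarray(labels))
```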
4. CONCLUSION
The purpose of this paper is to identify face detection problems and challenges and to compare numerous methods for face detection. There has been major advancement in this area because it is useful in real-world application products. Various face detection techniques are summarized, and methods for face detection are discussed along with their features, advantages, and accuracy. This paper compares algorithms that are helpful for emotion detection based on their accuracy and recent developments. There is still good scope for work to obtain efficient results by combining or improving the choice of features for the detection of face images, irrespective of the intensity of the background color or any occlusion. The important enhancements discussed in recent papers are emotion detection from side views and different parameters for real-time applications such as medicine, robotics, forensics, and many more.
REFERENCES
[1] P. Ekman and W. V. Friesen, “Facial Action Coding System: Manual,” Palo Alto, Calif.: Consulting Psychologists Press, 1978.
[2] I. M. Revina, and W. R. Sam Emmanuel, “A Survey on Human Face Expression Recognition Techniques,” Journal
of King Saud University-Computer and Information Sciences, 2018.
[3] Ming-Hsuan Yang, D. J. Kriegman, and N. Ahuja, “Detecting Faces in Images: A survey,” IEEE Transactions on
Pattern Analysis and Machine Intelligence, vol. 24, no. 1, pp. 34-58, 2002.
[4] A. Punitha, and M. K. Geetha, “Texture Based Emotion Recognition from Facial Expressions using Support Vector
Machine,” Int. J. Comput. Appl, vol. 80, no. 5, pp. 1-5, 2013.
[5] K. C. Yow, and R. Cipolla, “Feature-Based Human Face Detection,” Image and Vision Computing, vol. 15, no. 9,
pp. 713-735, 1997.
[6] S. Kr. Singh, D. S. Chauhan, M. Vatsa, and R. Singh, “A Robust Skin Color Based Face Detection Algorithm,” Tamkang Journal of Science and Engineering, vol. 6, no. 4, pp. 227-234, 2003.
[7] J. Li, G. Poulton, Y. Guo, and Rong-Yu Qiao, “Face Recognition Based on Multiple Region Features,” DICTA,
pp. 69-78, 2003.
[8] R. Patil, and V. Sahula, and AS Mandal, “Automatic Detection of Facial Feature Points in Image Sequences,”
International Conference on Image Information Processing, pp. 1-5, 2011.
[9] W. Kienzle, et al., “Face Detection-Efficient and Rank Deficient,” Advances in Neural Information Processing
Systems, pp. 673-680, 2005.
[10] P. Viola, M. Jones, “Robust Real-Time Object Detection,” International Journal of Computer Vision, vol. 4,
pp. 34-47, 2001.
[11] G Hemalatha, and CP Sumathi, “A Study of Techniques for Facial Detection and Expression Classification,”
International Journal of Computer Science and Engineering Survey, vol. 5, no. 2, 2014.
[12] Y. Tayal, P. K. Pandey and D. B. V. Singh, “Face Recognition using Eigenface,” International Association of
Scientific Innovation and Research (IASIR), vol. 3, no. 1, pp. 50-55, 2013.
[13] Y-Buhee and sukhanlee, “Robust Face Detection Based on Knowledge‐Directed Specification of Bottom‐Up
Saliency,” ETRI journal, vol. 33, no. 4, 2011.
[14] C. Shan, S. Gong, and P. W. McOwan, “Facial Expression Recognition Based on Local Binary Patterns:
A Comprehensive Study,” Image and vision Computing, vol. 27, no. 6. pp. 803-816, 2009.
[15] R. A. Patil, V. Sahula and AS Mandal, “Facial Expression Recognition in Image Sequences using Active Shape
Model and Support Vector Machine,” 2011 UKSIM 5th European Symposium on Computer Modeling and
Simulation, 2011.
[16] J. Kalita, and K. Das, “Recognition of Facial Expression using Eigenvector Based Distributed Features and
Euclidean Distance Based Decision Making Technique,” IJACSA, vol. 4, no. 2, 2013.
[17] Z. Zhang, X. Mu, L. Gao, “Recognizing Facial Expressions based on Gabor Filter Selection,” 2011 4th
International Congress on Image and Signal Processing, vol. 3, pp. 1544-1548, 2011.
[18] Z. Niu, and X. Qiu, “Facial Expression Recognition based on Weighted Principal Component Analysis and
Support Vector Machines,” 2010 3rd International Conference on Advanced Computer Theory and Engineering
(ICACTE), 2010.
[19] Li-Fen Chen, et al., “A New LDA-Based Face Recognition System which can Solve the Small Sample Size
Problem,” Pattern Recognition, vol. 33, no. 3, pp. 1713-1726, 2000.
[20] G Hemalatha, and CP Sumathi, “A Study of Techniques for Facial Detection and Expression Classification,”
International Journal of Computer Science and Engineering Survey, vol. 5, no. 2, pp. 27-37, 2014.
[21] B. Fasel, “Head-Pose Invariant Facial Expression Recognition using Convolutional Neural Networks,” Proceedings
of the 4th IEEE International Conference on Multimodal Interfaces, pp. 529, 2002
[22] D. Datcu and L. J. M. Rothkrantz, “Facial Expression Recognition with Relevance Vector Machines,” 2005 IEEE International Conference on Multimedia and Expo, pp. 193-196, 2005.
[23] P. S. Aleksic and A. K. Katsaggelos, “Automatic Facial Expression Recognition using Facial Animation Parameters and Multistream HMMs,” IEEE Transactions on Information Forensics and Security, vol. 1, no. 1, pp. 3-11, 2006.
[24] P. Ekman, and W. V. Friesen, “Unmasking the Face: A Guide to Recognizing Emotions From Facial Clues,”
Malor Books, 2003.
[25] S. Noh, H. Park, Y. Jin, Jong-Il. Park, “Feature-Adaptive Motion Energy Analysis for Facial Expression
Recognition,” International Symposium on Visual Computing, pp. 452-463, 2007.
[26] Y. Sun, L. Yin, “Facial Expression Recognition based on 3D Dynamic Range Model Sequences,” European
Conference on Computer Vision, pp. 58–71, 2008.
[27] G. Zhao, and M. Pietik, “Boosted Multi-Resolution Spatiotemporal Descriptors for Facial Expression Recognition,”
Pattern Recognition Letters, vol. 30, no. 12, pp. 1117-1127, 2009.
[28] S. B. Kazmi, Qurat-ul-Ain, M. Ishtiaq, M. A. Jaffar, “Texture Analysis Based Facial Expression Recognition using
A Bank Of Bayesian Classifiers,” 2010 International Conference on Information and Emerging Technologies,
pp. 1-6, 2010.
[29] L. Zhang, and D. Tjondronegoro, “Facial Expression Recognition using Facial Movement Features,” IEEE
Transactions on Affective Computing, vol. 2, no. 4, pp. 219-229, 2011.
[30] A. Poursaberi, H. A. Noubari, M. Gavrilova, and S. N. Yanushkevich, “Gauss–Laguerre Wavelet Textural Feature
Fusion with Geometrical Information for Facial Expression Identification,” EURASIP Journal on Image and Video
Processing, vol. 2012, no. 1, pp. 17, 2012.
[31] M. R. Dileep, and A. Danti, “Lines Of Connectivity-Face Model for Recognition of the Human Facial
Expressions,” Int. J. Artif. Intell. Mech, vol. 2, no. 2, 2013
[32] E. Owusu, Y. Zhan, and Q. R. Mao, “A Neural-Adaboost Based Facial Expression Recognition System,” Expert
Systems with Applications, vol. 41, no. 7, pp. 3383-3390, 2014.
[33] S. Biswas, and J. Sil, “An Efficient Expression Recognition Method using Contourlet Transform,” Proceedings of
the 2nd International Conference on Perception and Machine Intelligence, pp. 167-174, 2015.
[34] A. Ucar, Y. Demir and C. Guzelis. “A New Facial Expression Recognition based on Curvelet Transform and
Online Sequential Extreme Learning Machine Initialized with Spherical Clustering,” Neural Computing and
Applications, vol. 27, no. 1, pp. 131-142, 2016.
[35] Hung-Hsu Tsai and Yi-Cheng Chang, “Facial Expression Recognition using A Combination of Multiple Facial Features and Support Vector Machine,” Soft Computing, vol. 22, no. 13, pp. 4389-4405, 2018.
[36] Fengyuan Wang, Jianhua Lv, Guode Ying, Shenghui Chen, and Chi Zhang, “Facial Expression Recognition from Image based on Hybrid Features Understanding,” Journal of Visual Communication and Image Representation, vol. 59, pp. 84-88, 2019.
 
Smart monitoring technique for solar cell systems using internet of things ba...
Smart monitoring technique for solar cell systems using internet of things ba...Smart monitoring technique for solar cell systems using internet of things ba...
Smart monitoring technique for solar cell systems using internet of things ba...
 
An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...An efficient security framework for intrusion detection and prevention in int...
An efficient security framework for intrusion detection and prevention in int...
 
Developing a smart system for infant incubators using the internet of things ...
Developing a smart system for infant incubators using the internet of things ...Developing a smart system for infant incubators using the internet of things ...
Developing a smart system for infant incubators using the internet of things ...
 
A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...A review on internet of things-based stingless bee's honey production with im...
A review on internet of things-based stingless bee's honey production with im...
 
A trust based secure access control using authentication mechanism for intero...
A trust based secure access control using authentication mechanism for intero...A trust based secure access control using authentication mechanism for intero...
A trust based secure access control using authentication mechanism for intero...
 
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbers
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbersFuzzy linear programming with the intuitionistic polygonal fuzzy numbers
Fuzzy linear programming with the intuitionistic polygonal fuzzy numbers
 
The performance of artificial intelligence in prostate magnetic resonance im...
The performance of artificial intelligence in prostate  magnetic resonance im...The performance of artificial intelligence in prostate  magnetic resonance im...
The performance of artificial intelligence in prostate magnetic resonance im...
 
Seizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networksSeizure stage detection of epileptic seizure using convolutional neural networks
Seizure stage detection of epileptic seizure using convolutional neural networks
 
Analysis of driving style using self-organizing maps to analyze driver behavior
Analysis of driving style using self-organizing maps to analyze driver behaviorAnalysis of driving style using self-organizing maps to analyze driver behavior
Analysis of driving style using self-organizing maps to analyze driver behavior
 
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...Hyperspectral object classification using hybrid spectral-spatial fusion and ...
Hyperspectral object classification using hybrid spectral-spatial fusion and ...
 

Recently uploaded

CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxCFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
R&R Consult
 
WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234
AafreenAbuthahir2
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
seandesed
 
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang,  ICLR 2024, MLILAB, KAIST AI.pdfJ.Yang,  ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
MLILAB
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
Neometrix_Engineering_Pvt_Ltd
 
Final project report on grocery store management system..pdf
Final project report on grocery store management system..pdfFinal project report on grocery store management system..pdf
Final project report on grocery store management system..pdf
Kamal Acharya
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
MdTanvirMahtab2
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
Robbie Edward Sayers
 
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdfWater Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation & Control
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
SamSarthak3
 
LIGA(E)11111111111111111111111111111111111111111.ppt
LIGA(E)11111111111111111111111111111111111111111.pptLIGA(E)11111111111111111111111111111111111111111.ppt
LIGA(E)11111111111111111111111111111111111111111.ppt
ssuser9bd3ba
 
The role of big data in decision making.
The role of big data in decision making.The role of big data in decision making.
The role of big data in decision making.
ankuprajapati0525
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Dr.Costas Sachpazis
 
Forklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella PartsForklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella Parts
Intella Parts
 
ASME IX(9) 2007 Full Version .pdf
ASME IX(9)  2007 Full Version       .pdfASME IX(9)  2007 Full Version       .pdf
ASME IX(9) 2007 Full Version .pdf
AhmedHussein950959
 
Railway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdfRailway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdf
TeeVichai
 
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
H.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdfH.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdf
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
MLILAB
 
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
Amil Baba Dawood bangali
 
Quality defects in TMT Bars, Possible causes and Potential Solutions.
Quality defects in TMT Bars, Possible causes and Potential Solutions.Quality defects in TMT Bars, Possible causes and Potential Solutions.
Quality defects in TMT Bars, Possible causes and Potential Solutions.
PrashantGoswami42
 
power quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptxpower quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptx
ViniHema
 

Recently uploaded (20)

CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptxCFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
CFD Simulation of By-pass Flow in a HRSG module by R&R Consult.pptx
 
WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234WATER CRISIS and its solutions-pptx 1234
WATER CRISIS and its solutions-pptx 1234
 
Architectural Portfolio Sean Lockwood
Architectural Portfolio Sean LockwoodArchitectural Portfolio Sean Lockwood
Architectural Portfolio Sean Lockwood
 
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang,  ICLR 2024, MLILAB, KAIST AI.pdfJ.Yang,  ICLR 2024, MLILAB, KAIST AI.pdf
J.Yang, ICLR 2024, MLILAB, KAIST AI.pdf
 
Standard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - NeometrixStandard Reomte Control Interface - Neometrix
Standard Reomte Control Interface - Neometrix
 
Final project report on grocery store management system..pdf
Final project report on grocery store management system..pdfFinal project report on grocery store management system..pdf
Final project report on grocery store management system..pdf
 
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
Industrial Training at Shahjalal Fertilizer Company Limited (SFCL)
 
HYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generationHYDROPOWER - Hydroelectric power generation
HYDROPOWER - Hydroelectric power generation
 
Water Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdfWater Industry Process Automation and Control Monthly - May 2024.pdf
Water Industry Process Automation and Control Monthly - May 2024.pdf
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
 
LIGA(E)11111111111111111111111111111111111111111.ppt
LIGA(E)11111111111111111111111111111111111111111.pptLIGA(E)11111111111111111111111111111111111111111.ppt
LIGA(E)11111111111111111111111111111111111111111.ppt
 
The role of big data in decision making.
The role of big data in decision making.The role of big data in decision making.
The role of big data in decision making.
 
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
Sachpazis:Terzaghi Bearing Capacity Estimation in simple terms with Calculati...
 
Forklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella PartsForklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella Parts
 
ASME IX(9) 2007 Full Version .pdf
ASME IX(9)  2007 Full Version       .pdfASME IX(9)  2007 Full Version       .pdf
ASME IX(9) 2007 Full Version .pdf
 
Railway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdfRailway Signalling Principles Edition 3.pdf
Railway Signalling Principles Edition 3.pdf
 
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
H.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdfH.Seo,  ICLR 2024, MLILAB,  KAIST AI.pdf
H.Seo, ICLR 2024, MLILAB, KAIST AI.pdf
 
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
NO1 Uk best vashikaran specialist in delhi vashikaran baba near me online vas...
 
Quality defects in TMT Bars, Possible causes and Potential Solutions.
Quality defects in TMT Bars, Possible causes and Potential Solutions.Quality defects in TMT Bars, Possible causes and Potential Solutions.
Quality defects in TMT Bars, Possible causes and Potential Solutions.
 
power quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptxpower quality voltage fluctuation UNIT - I.pptx
power quality voltage fluctuation UNIT - I.pptx
 

Analysis on techniques used to recognize and identifying the Human emotions

  • 1. International Journal of Electrical and Computer Engineering (IJECE) Vol. 10, No. 3, June 2020, pp. 3307~3314 ISSN: 2088-8708, DOI: 10.11591/ijece.v10i3.pp3307-3314  3307 Journal homepage: http://ijece.iaescore.com/index.php/IJECE Analysis on techniques used to recognize and identifying the Human emotions Praveen Kulkarni1 , Rajesh T M2 1 Research Scholar at Dayananda Sagar University, India 1 Department of Computer Science and Engineering, Faculty of Engineering, CHRIST (Deemed to be university), India 2 Department of Computer Science and Engineering, Dayananda Sagar University, India Article Info ABSTRACT Article history: Received Jul 26, 2019 Revised Dec 17, 2019 Accepted Jan 8, 2020 Facial expression is a major area for non-verbal language in day to day life communication. As the statistical analysis shows only 7 percent of the message in communication was covered in verbal communication while 55 percent transmitted by facial expression. Emotional expression has been a research subject of physiology since Darwin’s work on emotional expression in the 19th century. According to Psychological theory the classification of human emotion is classified majorly into six emotions: happiness, fear, anger, surprise, disgust, and sadness. Facial expressions which involve the emotions and the nature of speech play a foremost role in expressing these emotions. Thereafter, researchers developed a system based on Anatomic of face named Facial Action Coding System (FACS) in 1970. Ever since the development of FACS there is a rapid progress in the domain of emotion recognition. This work is intended to give a thorough comparative analysis of the various techniques and methods that were applied to recognize and identify human emotions. This analysis results will help to identify proper and suitable techniques, algorithms and the methodologies for future research directions. In this paper extensive analysis on various recognition techniques used to identify the complexity in recognizing the facial expression is presented. Keywords: Classification Face detection Feature extraction Copyright © 2020 Institute of Advanced Engineering and Science. All rights reserved. Corresponding Author: Praveen Kulkarni, Department of Computer Science and Engineering, Faculty of Engineering, CHRIST (Deemed to be University), Bangalore, India. Email: praveencs024@gmail.com 1. INTRODUCTION Emotions play a very important function in many fields likes forensic crime detection, psychologically affected patients, students mentoring in academics and victim observation in court, etc. However, so much of the growth happened in this technological domain, still, we can find lots of drawbacks and loopholes in terms of accuracy in various results we found in our survey work. There are a number of jobs performed by individuals and groups. The goal of facial expression is to identify oneself by observing a single image or multiple images, which is the emotion that the image shows. The human face has several components such as eye, nose, mouth, Brow and a few others. Based on the movement of those components and change of shape and sizes emotions may be extracted in various ways. Reorganization of emotion in human has become a greatest challenge faced in the interaction of computer and humans. Most of the effort on emotion recognition focuses on information extracted from visual or audio separately. 
and the tracking of the eyes; achieving a high accuracy rate, however, remains an unsolved problem. Much of this earlier work has set the benchmark and foundation for many current research topics.
The researcher in [1] has dedicated his career to the study of emotions, focusing mainly on six basic emotions: surprise, sadness, fear, anger, disgust, and happiness. In the 1990s he expanded this list of basic emotions to include a variety of positive and negative emotions such as amusement, contempt, awkwardness, relief, pleasure, satisfaction, guilt, and pride. Understanding facial expression may seem a commonplace phenomenon, yet it has become one of the most essential topics in the history of emotional psychology. More recently, research on facial expressions has led to the emergence of new concepts, new techniques, and innovative results; scientists are formulating alternative accounts while earlier theories are attracting renewed interest. Darwin's arguments suggest that emotions evolved to act as an instrument of communication between individuals.

A variety of classifiers has been applied to facial expression recognition over the years. In 2003, Gao et al. used dLHD, and an ID3 decision tree was used for classification (Noh et al., 2007). Zhao and Pietikainen (2009) and Song et al. (2010) used SVM for classification, and it has been reported that the SVM classifier gives the highest accuracy across several facial expressions [2]. The unsupervised learning algorithm LVQ (Learning Vector Quantization) (Bashyal, 2008) and the MLP have also been used for classification [3]. The KNN algorithm (Poursaberi, 2012) is a classification method in which the relationships among the assessment models are estimated during training. The HMM classifier is a statistical representation for categorizing expressions into various types in face detection (Taylor, 2014). Classification and regression trees (CART) are machine learning algorithms (Salman, 2016) used for classification with distance vectors. CNN (2016), MFFNN, DNN (2015), and MDC (Islam, 2018) are some of the classifiers used recently. Compared with many classifiers, SVM gives improved recognition accuracy and better classification, and it is presently the most widely used classifier; other classifiers such as CART, pairwise, and neural-network-based methods have come into use in recent years compared to past decades. A minimal comparison of such classifiers on a common feature matrix is sketched below.
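As a purely illustrative aid, and not drawn from the surveyed papers, the following Python sketch shows how such alternative classifiers could be compared on a common facial-expression feature matrix using scikit-learn; the random feature vectors stand in for real descriptors, and the hyperparameters are arbitrary assumptions.

# Illustrative only: compares several classifiers mentioned above (SVM, KNN,
# a CART-style decision tree, MLP) on a placeholder feature matrix. The random
# features stand in for real facial-expression descriptors such as LBP or
# Gabor responses; the accuracies printed here are meaningless except as a template.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 64))          # 600 samples, 64-dimensional feature vectors
y = rng.integers(0, 6, size=600)        # 6 basic emotion labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

classifiers = {
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
    "KNN (k=5)": KNeighborsClassifier(n_neighbors=5),
    "Decision tree": DecisionTreeClassifier(max_depth=10),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, clf.predict(X_te)):.2f}")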
Emotion recognition systems find applications in several fascinating areas. With recent advances in artificial intelligence, particularly autonomous robots, the need for a robust expression recognition system is obvious. The recognition of emotions plays an important role in recognizing one's own affect and, in turn, helps to build a sensitive human-computer interface (HCI). In addition to the two main applications, namely artificial intelligence and affective human-machine interfaces, emotion recognition systems are used in a large variety of other domains such as telecommunications, video gaming, video animation, psychiatry, automotive security, educational computer systems, and many others. Many algorithms have been suggested to develop systems and applications that can detect emotions well, and computer applications could communicate better by changing their responses based on the emotional state of human users. Emotion detection using the facial components described above is still in use, and it is generally carried out stage by stage.

Preprocessing is the first stage, followed by feature extraction and then classification. Over the years there has been considerable growth and advancement in these three stages, with a wide variety of algorithms; in recent work, a larger number of features is extracted from faces to verify the emotions. The facial emotion recognition field has gained a great deal of attention in the past two decades, with applications and innovation not only in the perceptual and cognitive sciences but also in affective computing and computer animation. A much more recent study suggests that there are far more basic emotions than previously believed: in a study published in the Proceedings of the National Academy of Sciences, researchers identified 27 different categories of emotion and found that, rather than being entirely distinct, individuals experience these emotions on a gradient. Recognition of facial emotion and expression has also improved recently, driven by the fast development of machine learning and artificial intelligence (AI) techniques, as well as human-computer interaction, virtual reality (VR), and augmented reality (AR); detection of facial expressions is also found in advanced driver assistance systems and entertainment. Although the technology for recognizing emotions is important and has evolved in different fields, it is still an open problem. Detecting human emotion can be achieved through facial images, voice, and body posture; among these, the facial image is the most frequent source, and full frontal facial images are most commonly used. The procedure for recognizing emotions is not simple, because extracting the appropriate characteristics and detecting emotions requires complex steps.

Facial Expression Recognition (FER) consists of five phases, as shown in Figure 1. Noise elimination and enhancement are performed in the pre-processing phase, which takes an image or a sequence of images (a time series of images from neutral to an expression) as the input face for further processing. Detection of facial components returns regions of interest (ROI) such as the eyes, nose, cheeks, mouth, forehead, and ears. The feature extraction phase concerns the extraction of characteristics from the ROI. Here we discuss work not included in previous surveys and how the detection of emotions has changed in the last two decades. The topics discussed in the following sections compare previously published work with the present research being carried out. A minimal end-to-end sketch of such a pipeline is given below.
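To make the staged pipeline concrete, the following sketch strings the phases together with OpenCV and scikit-learn. It is an assumed skeleton rather than the method of any surveyed paper: the Haar-cascade detector, the 48x48 patch size, the raw-pixel extract_features placeholder, and the SVM classifier are all illustrative choices that would normally be replaced by the descriptors and classifiers discussed in the following sections.

# Illustrative FER pipeline skeleton: preprocess -> detect face -> extract
# features -> classify. Placeholder choices throughout; not the paper's method.
import cv2
import numpy as np
from sklearn.svm import SVC

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def preprocess(img_bgr):
    """Grayscale conversion and histogram equalization (noise/illumination step)."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.equalizeHist(gray)

def detect_face(gray):
    """Return the largest detected face region resized to 48x48, or None."""
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return cv2.resize(gray[y:y + h, x:x + w], (48, 48))

def extract_features(face_48x48):
    """Placeholder feature extractor: flattened, normalized pixels."""
    return face_48x48.astype(np.float32).ravel() / 255.0

# Training on a labelled set of face images (not shown) would look like:
#   X = np.stack([extract_features(detect_face(preprocess(img))) for img in images])
#   clf = SVC(kernel="rbf").fit(X, labels)
# and at test time:
#   clf.predict([extract_features(detect_face(preprocess(test_img)))])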
Figure 1. Phases of facial expression recognition

2. RESEARCH METHOD
Pre-processing is the first step in detecting a face. To obtain purely facial images with good intensity, proper size, and an identical normalized shape, pre-processing has to be conducted. Converting an image into a clean, normalized face image from which characteristics can be extracted involves detecting characteristic points, rotating the image to align them, and identifying and cropping the facial region with a rectangle, according to the model of the face. Face detectors for a single image can be categorized into four types of methods: feature-based, appearance-based, knowledge-based, and template-based, as shown in Figure 2.

Figure 2. Face detection methods

2.1. Feature based
This technique finds faces by extracting structural features of the face. A classifier is first trained and then used to differentiate between facial and non-facial regions; the idea is that self-generated knowledge of faces is used to overcome the limitations. The approach is divided into several steps, and even on photos with several faces a success rate of 94% has been reported.

2.2. Texture-based
Here emotion recognition is performed on texture characteristics extracted from the gray-level co-occurrence matrix (GLCM): the GLCM features are extracted and classified with support vector machines using various kernels. Statistical features taken from the GLCM are given as input to the SVM for classification, and the detection rate is reported as 90% [4]. Another algorithm detects characteristic points using a spatial filter technique, as described in [5]; grouping then indicates the facial candidates. The classifier must first be trained so that it can distinguish facial from non-facial regions. This technique reports an 85% detection rate on one hundred images with different scales and orientations.

2.3. Skin color based
As described by the authors in [6], algorithms were compared using three color models, and this combination of colors leads to a face detection algorithm with color replacement. Once the skin region is obtained, non-skin areas such as the eyes are removed by masking out the background colors, and the image is converted to grayscale and then to a binary image with a suitable threshold. The detection rate is 95.18%. A minimal color-space segmentation sketch in this spirit is given below.
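As a rough illustration of the skin-color idea, and not the exact three-color-model algorithm of [6], the following sketch thresholds skin-like pixels in the YCrCb color space (the Cr/Cb ranges are commonly quoted approximations and are assumptions here) and crops the largest connected region as a face candidate.

# Illustrative skin-color face-candidate detection (not the algorithm of [6]):
# threshold skin-like pixels in YCrCb, then keep the largest connected region.
import cv2
import numpy as np

def skin_face_candidate(img_bgr):
    ycrcb = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2YCrCb)
    # Approximate skin thresholds on Cr/Cb; treat these bounds as assumptions.
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Remove small speckles before looking for the dominant skin region.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return img_bgr[y:y + h, x:x + w]   # face candidate region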
2.4. Multiple feature
AdaBoost selects multiple features of the face region in this algorithm [7]. The face is divided into different regions by different orthogonal parts, and components such as the nose, eyes, and mouth are identified for analysis. The regions where the combination was used are given as input to the classifier; in each phase it chooses the best arrangement and then modifies the weights in the following iterations. The detection rate over many subjects is greater than that of the common orthogonal-component analysis method.

2.5. Template matching methods
When an image is given as input, this method matches it against a predefined template to identify the face. For example, the face can be divided into parts such as the eyes, nose, and mouth, and a model can be designed that is helpful for edge detection. This approach is easy to implement; however, it is not sufficient on its own for face detection, so the shape of the templates has to be planned and managed to address these issues.

2.6. Appearance-based methods
These are designed mainly for face detection. The appearance-based technique depends on a set of training face pictures to find face models, and this type of approach performs better than the alternatives. Generally, the appearance-based technique relies on techniques from statistical analysis and machine learning to find the relevant characteristics of face images. This technique is also employed for feature extraction in face recognition.

3. RESULTS AND DISCUSSIONS
The effect of lighting should be eliminated and the image scaled to a fixed size [8]. To identify the face in an image automatically, facial points should be detected. The algorithmic scheme proposed in [9], together with the technique of [10], is commonly used for face detection. The algorithm has been tested on the Cohn-Kanade database as well as on the database of [11], detecting fourteen facial expression points with a median precision of 86% on Cohn-Kanade and 83% on the latter database. The eigenface-based algorithm of Yogesh Tayal et al. [12] is applied to a mixture of images taken with varying illumination, backgrounds, and image sizes; the time needed by the algorithm is 4.5456 seconds. Here the Euclidean weight and distance of the input image are computed, and recognition is performed by comparing these characteristics against the database (a minimal sketch of this projection-and-distance matching is given at the end of this section). Knowledge-based methods mainly depend on predefined rules supported by human knowledge about faces, for example that a face has a nose and a mouth at certain distances from each other. The main drawback of this strategy is the difficulty of constructing an appropriate set of rules: rules that are too general or too detailed lead to many false positives, and this type of approach alone is not capable of finding faces in many different pictures. The face skin color and shape are adapted to the size of the window and to the color signature to calculate the color distance [13]; a face normally comprises a nose, eyes, and mouth at precise distances and positions with respect to each other, and the major problem of the method arises when the rules have to be defined. This tactic reached 93.4% detection with false positives up to 7.1%.
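Returning to the eigenface approach of [12], the projection-and-distance matching step can be sketched generically as follows; this uses scikit-learn's PCA rather than the authors' implementation, and the number of components is an arbitrary assumption.

# Generic eigenface-style matching: project faces onto principal components,
# then label a probe image by the Euclidean-nearest training projection.
import numpy as np
from sklearn.decomposition import PCA

def train_eigenfaces(train_faces, labels, n_components=50):
    """train_faces: (n_samples, h*w) flattened grayscale faces; n_components is assumed."""
    pca = PCA(n_components=n_components, whiten=True).fit(train_faces)
    return pca, pca.transform(train_faces), np.asarray(labels)

def recognize(pca, train_proj, labels, probe_face):
    """Return the label of the closest training face and its Euclidean distance."""
    probe = pca.transform(probe_face.reshape(1, -1))
    dists = np.linalg.norm(train_proj - probe, axis=1)
    return labels[np.argmin(dists)], dists.min()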
3.1. Template matching
The author in [14] bases recognition on local statistical characteristics, using local binary patterns (LBP) for person-independent expression recognition. The detection rate is 86.97% on the MMI database and 85.6% on the JAFFE database.

3.2. Active shape model
In [15], a sequence of images is processed with a wire-frame face model, combining active appearance and shape tracking with a support vector machine (SVM) for classification. The model deforms the shape up to the final frame as the expression changes.

3.3. Distribution features
In [16], five significant parts are cut out of each image for feature extraction, and the specific eigenvectors for the expressions are stored. The eigenvectors are computed, the input facial image is accepted, and SVM is used for classification. A minimal LBP-plus-SVM sketch in the spirit of these sections is given below.
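The LBP descriptor of section 3.1 combined with the SVM classification used in sections 3.2 and 3.3 can be illustrated as follows; the (P, R) neighborhood and the uniform-pattern histogram are standard choices assumed here, not the exact settings of [14-16].

# Illustrative LBP-histogram features classified with an SVM.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

P, R = 8, 1  # 8 neighbours on a circle of radius 1 (assumed parameters)

def lbp_histogram(gray_face):
    """Uniform LBP codes summarized as a normalized histogram feature vector."""
    codes = local_binary_pattern(gray_face, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=P + 2, range=(0, P + 2), density=True)
    return hist

# Usage sketch (faces: list of 2-D grayscale arrays, labels: expression classes):
#   X = np.stack([lbp_histogram(f) for f in faces])
#   clf = SVC(kernel="linear").fit(X, labels)
#   prediction = clf.predict([lbp_histogram(test_face)])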
3.4. Feature extraction
Feature extraction is a way of recognizing a face. It involves several steps such as dimensionality reduction, extraction of the facial appearance, and selection of characteristics; dimensionality reduction is a significant task in a recognition system. Figure 3 shows the different feature extraction methods with their recognition rates. With the discrete cosine transform (DCT), the transform is applied to the image and features such as the mouth and eyes are extracted [17]; the classifier used is AdaBoost, with a recognition rate of 75.93%.

Figure 3. Feature extraction and recognition rate

3.5. Gabor filter
Gabor filters are used to separate different expressions. The expression database used for the recognition system was JAFFE, and the recognition rate is above 93%. A minimal Gabor filter-bank sketch is given below.
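A Gabor feature extractor of the kind described above can be sketched with OpenCV as follows; the number of orientations, wavelength, and kernel size are typical values assumed for illustration, not the settings used in the surveyed systems.

# Illustrative Gabor filter bank: convolve a grayscale face with kernels at
# several orientations and keep simple statistics of each response as features.
import cv2
import numpy as np

def gabor_features(gray_face, n_orientations=8, ksize=21,
                   sigma=4.0, lambd=10.0, gamma=0.5):
    feats = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        # (ksize, sigma, theta, lambda, gamma, psi) with psi = 0
        kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta, lambd, gamma, 0)
        response = cv2.filter2D(gray_face.astype(np.float32), cv2.CV_32F, kernel)
        feats.extend([response.mean(), response.std()])  # per-orientation statistics
    return np.array(feats)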
3.6. PCA
Facial features are extracted by principal component analysis according to the author of [18]; the weighted principal component analysis (WPCA) method is used to merge multiple features for classification. The Euclidean distance is calculated to obtain the similarity between the models, and the facial expression is then recognized with a nearest-neighbor algorithm; the recognition rate is 88.25%. LDA, the linear discriminant method, was proposed for calculating discriminating vectors with a two-stage procedure [19]; the vectors are combined using the K-means grouping method with each change sample, and 91% recognition was reported for these classifications.

3.6.1. Hidden Markov model
A hidden Markov model was developed to categorize the higher emotional states, starting from the lowest level, as involved, insecure, disagreeing, hopeful, and discouraging. In the author's view, emotional categorization is well suited to understanding the state of emotions, so it serves as information and a corresponding assignment is made between facial features. Expert rules are used for segmenting and recognizing the emotional state over a number of video sequences; the probabilistic framework models variable-length time sequences, and the fusion of detections is computed in real time. Recognition of the insecure emotion is 87%.

3.6.2. Neural networks
This is a combination of two methods, feature extraction and a neural network, with two phases: face detection and classification. Preprocessing of the image is done to reduce the time taken and to increase the image quality. Here the neural network is trained with face and non-face pictures from the Yale Face database; the pictures in the data set are 27x18 pixels in grayscale, and the reported rate is 84.4 percent.

3.6.3. Support vector machine
Each video frame is scanned in real time to detect the frontal face; the faces are then resized into image patches of the same size and passed through a bank of Gabor energy filters. Finally, the author developed a system that outputs codes for different facial expressions at twenty-four frames per second and animates the computer-generated facial appearance; it also performs fully automatic facial action coding. The recognition rate is 93%.

3.6.4. AdaBoost
The frontal face in a sequence of images is classified into seven categories: surprise, anger, fear, disgust, neutral, joy, and sadness. Here the popular technique is executed without feature blocks; skin color is detected in this work, the facial expression features are extracted, and finally the facial expressions are distinguished from the shifting characteristics of the landmark regions on the face, using the formula proposed by the author and an AdaBoost-based classifier. The accuracy rate is 90%. As shown in Figure 4, authors report different classifications and accuracy rates [20]. Table 1 shows the analysis of the different methods and techniques used for emotion detection, with their accuracy; a minimal AdaBoost sketch follows the table.

Figure 4. Classification and accuracy rate

Table 1. Analysis of different methods and their accuracy
Methods | Recognition accuracy (%) | No. of expressions recognized | Advantages and major contribution | Ref. and year
CNN | Not reported | Not reported | Multiscale feature extractor; in-plane pose variation | [21] 2002
dLHD and LEM | 86.6 | 3 | Oriented structural features | [22] 2003
RVM (relevance vector machine) | 90.84 | - | Recognition in static images | [23] 2005
Multi-stream HMMs | - | - | Recognition error reduced to 44% compared to single stream | [24] 2006
ID3 decision tree | 75 | 6 | Cost effective with respect to accuracy and speed | [25] 2007
LVQ and GF | 88.86 | Not reported | Efficient emotion detection | [26] 2008
SVM and GASM | 93.85 | 6 | Learning using AdaBoost; flexible selection | [27] 2009
5 parallel Bayesian classifiers | - | - | Multi-classification achieved; static and real-time video | [28] 2010
SVM | 82.5 | 6 | Based on distance features; effective recognition, highest CRR | [29] 2011
LBP, SVM | 95.84 | 6 | Image-based recognition; spatio-temporal features | [30] 2012
Lines of connectivity | 93.8 | 3 | Triangular face using LC and a geometric approach | [31] 2013
GF, MFFNN | 94.16 | 7 | Very low computational cost | [32] 2014
SVM and DCT | 98.63 | 6 | Fast and high accuracy | [33] 2015
OSELM-SC | 95.17 | 6 | Online sequential extreme learning machine; good recognition accuracy and robustness | [34] 2017
DCT, GF, SVM | More than 90 | 8 | Good detection rate under various light sources | [35] 2018
CNN, SVM | - | - | Better classification compared to state-of-the-art CNN; deep learning features | [36] 2019
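To make the AdaBoost entries in section 3.6.4 and Table 1 concrete, the following minimal sketch trains a boosted classifier on expression feature vectors with scikit-learn; the default decision-stump base learner and the number of estimators are generic assumptions, not the configuration of the surveyed systems.

# Illustrative AdaBoost classification of expression feature vectors.
from sklearn.ensemble import AdaBoostClassifier

def train_adaboost(X, y, n_estimators=100):
    """X: (n_samples, n_features) expression features (e.g. LBP or Gabor statistics
    from the sketches above); y: integer expression labels. scikit-learn's default
    base learner for AdaBoostClassifier is a depth-1 decision tree (a stump)."""
    return AdaBoostClassifier(n_estimators=n_estimators).fit(X, y)

# Usage sketch: predictions = train_adaboost(X_train, y_train).predict(X_test)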
4. CONCLUSION
The purpose of this paper is to identify face detection problems and challenges and to compare numerous methods for face detection. There has been major advancement in this area because it is helpful in real-world application products. Various face detection techniques are summarized, and the methods for face detection, their features, advantages, and accuracy are discussed. This paper compares algorithms which are helpful for emotion detection based on their accuracy and recent development. There is still good scope for further work to obtain efficient results by combining or improving the choice of features for the detection of face images, irrespective of the intensity of the background color or any occlusion. The important future enhancements discussed in recent papers are emotion detection from side views and different parameters for real-time applications such as medicine, robotics, forensics, and many more.

REFERENCES
[1] P. Ekman and W. V. Friesen, "Facial Action Coding System: Manual," Palo Alto, Calif.: Consulting Psychologists Press, 1978.
[2] I. M. Revina and W. R. Sam Emmanuel, "A Survey on Human Face Expression Recognition Techniques," Journal of King Saud University - Computer and Information Sciences, 2018.
[3] Ming-Hsuan Yang, D. J. Kriegman, and N. Ahuja, "Detecting Faces in Images: A Survey," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 1, pp. 34-58, 2002.
[4] A. Punitha and M. K. Geetha, "Texture Based Emotion Recognition from Facial Expressions using Support Vector Machine," Int. J. Comput. Appl., vol. 80, no. 5, pp. 1-5, 2013.
[5] K. C. Yow and R. Cipolla, "Feature-Based Human Face Detection," Image and Vision Computing, vol. 15, no. 9, pp. 713-735, 1997.
[6] S. Kr. Singh, D. S. Chauhan, M. Vatsa, and R. Singh, "A Robust Skin Color Based Face Detection Algorithm," Tamkang Journal of Science and Engineering, vol. 6, no. 4, pp. 227-234, 2003.
[7] J. Li, G. Poulton, Y. Guo, and Rong-Yu Qiao, "Face Recognition Based on Multiple Region Features," DICTA, pp. 69-78, 2003.
[8] R. Patil, V. Sahula, and A. S. Mandal, "Automatic Detection of Facial Feature Points in Image Sequences," International Conference on Image Information Processing, pp. 1-5, 2011.
[9] W. Kienzle, et al., "Face Detection - Efficient and Rank Deficient," Advances in Neural Information Processing Systems, pp. 673-680, 2005.
[10] P. Viola and M. Jones, "Robust Real-Time Object Detection," International Journal of Computer Vision, vol. 4, pp. 34-47, 2001.
[11] G. Hemalatha and C. P. Sumathi, "A Study of Techniques for Facial Detection and Expression Classification," International Journal of Computer Science and Engineering Survey, vol. 5, no. 2, 2014.
[12] Y. Tayal, P. K. Pandey, and D. B. V. Singh, "Face Recognition using Eigenface," International Association of Scientific Innovation and Research (IASIR), vol. 3, no. 1, pp. 50-55, 2013.
[13] Y.-B. Lee and S. Lee, "Robust Face Detection Based on Knowledge-Directed Specification of Bottom-Up Saliency," ETRI Journal, vol. 33, no. 4, 2011.
[14] C. Shan, S. Gong, and P. W. McOwan, "Facial Expression Recognition Based on Local Binary Patterns: A Comprehensive Study," Image and Vision Computing, vol. 27, no. 6, pp. 803-816, 2009.
[15] R. A. Patil, V. Sahula, and A. S. Mandal, "Facial Expression Recognition in Image Sequences using Active Shape Model and Support Vector Machine," 2011 UKSim 5th European Symposium on Computer Modeling and Simulation, 2011.
[16] J. Kalita and K. Das, "Recognition of Facial Expression using Eigenvector Based Distributed Features and Euclidean Distance Based Decision Making Technique," IJACSA, vol. 4, no. 2, 2013.
[17] Z. Zhang, X. Mu, and L. Gao, "Recognizing Facial Expressions Based on Gabor Filter Selection," 2011 4th International Congress on Image and Signal Processing, vol. 3, pp. 1544-1548, 2011.
[18] Z. Niu and X. Qiu, "Facial Expression Recognition Based on Weighted Principal Component Analysis and Support Vector Machines," 2010 3rd International Conference on Advanced Computer Theory and Engineering (ICACTE), 2010.
[19] Li-Fen Chen, et al., "A New LDA-Based Face Recognition System which can Solve the Small Sample Size Problem," Pattern Recognition, vol. 33, no. 3, pp. 1713-1726, 2000.
[20] G. Hemalatha and C. P. Sumathi, "A Study of Techniques for Facial Detection and Expression Classification," International Journal of Computer Science and Engineering Survey, vol. 5, no. 2, pp. 27-37, 2014.
[21] B. Fasel, "Head-Pose Invariant Facial Expression Recognition using Convolutional Neural Networks," Proceedings of the 4th IEEE International Conference on Multimodal Interfaces, p. 529, 2002.
[22] D. Datcu and L. J. M. Rothkrantz, "Facial Expression Recognition with Relevance Vector Machines," 2005 IEEE International Conference on Multimedia and Expo, pp. 193-196, 2005.
[23] P. S. Aleksic and A. K. Katsaggelos, "Automatic Facial Expression Recognition using Facial Animation Parameters and Multistream HMMs," IEEE Transactions on Information Forensics and Security, vol. 1, no. 1, pp. 3-11, 2006.
[24] P. Ekman and W. V. Friesen, "Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues," Malor Books, 2003.
[25] S. Noh, H. Park, Y. Jin, and Jong-Il Park, "Feature-Adaptive Motion Energy Analysis for Facial Expression Recognition," International Symposium on Visual Computing, pp. 452-463, 2007.
[26] Y. Sun and L. Yin, "Facial Expression Recognition Based on 3D Dynamic Range Model Sequences," European Conference on Computer Vision, pp. 58-71, 2008.
[27] G. Zhao and M. Pietikainen, "Boosted Multi-Resolution Spatiotemporal Descriptors for Facial Expression Recognition," Pattern Recognition Letters, vol. 30, no. 12, pp. 1117-1127, 2009.
[28] S. B. Kazmi, Qurat-ul-Ain, M. Ishtiaq, and M. A. Jaffar, "Texture Analysis Based Facial Expression Recognition using a Bank of Bayesian Classifiers," 2010 International Conference on Information and Emerging Technologies, pp. 1-6, 2010.
[29] L. Zhang and D. Tjondronegoro, "Facial Expression Recognition using Facial Movement Features," IEEE Transactions on Affective Computing, vol. 2, no. 4, pp. 219-229, 2011.
[30] A. Poursaberi, H. A. Noubari, M. Gavrilova, and S. N. Yanushkevich, "Gauss-Laguerre Wavelet Textural Feature Fusion with Geometrical Information for Facial Expression Identification," EURASIP Journal on Image and Video Processing, vol. 2012, no. 1, p. 17, 2012.
[31] M. R. Dileep and A. Danti, "Lines of Connectivity - Face Model for Recognition of the Human Facial Expressions," Int. J. Artif. Intell. Mech., vol. 2, no. 2, 2013.
[32] E. Owusu, Y. Zhan, and Q. R. Mao, "A Neural-AdaBoost Based Facial Expression Recognition System," Expert Systems with Applications, vol. 41, no. 7, pp. 3383-3390, 2014.
[33] S. Biswas and J. Sil, "An Efficient Expression Recognition Method using Contourlet Transform," Proceedings of the 2nd International Conference on Perception and Machine Intelligence, pp. 167-174, 2015.
[34] A. Ucar, Y. Demir, and C. Guzelis, "A New Facial Expression Recognition Based on Curvelet Transform and Online Sequential Extreme Learning Machine Initialized with Spherical Clustering," Neural Computing and Applications, vol. 27, no. 1, pp. 131-142, 2016.
[35] Hung-Hsu Tsai and Yi-Cheng Chang, "Facial Expression Recognition using a Combination of Multiple Facial Features and Support Vector Machine," Soft Computing, vol. 22, no. 13, pp. 4389-4405, 2018.
[36] Fengyuan Wang, Jianhua Lv, Guode Ying, Shenghui Chen, and Chi Zhang, "Facial Expression Recognition from Image Based on Hybrid Features Understanding," Journal of Visual Communication and Image Representation, vol. 59, pp. 84-88, 2019.