
Automatic facial expression recognition using features of salient facial patches

Final Year IEEE Projects for BE, B.Tech, ME, M.Tech, M.Sc, MCA & Diploma students: the latest Java, .NET, MATLAB, NS2, Android, Embedded, Mechanical, Robotics, VLSI, and Power Electronics IEEE projects are delivered as complete working products with documentation, along with real-time software and embedded training.

JAVA, .NET, NS2, MATLAB PROJECTS:

Networking, Network Security, Data Mining, Cloud Computing, Grid Computing, Web Services, Mobile Computing, Software Engineering, Image Processing, E-Commerce, Games App, Multimedia, etc.,

EMBEDDED SYSTEMS:

Embedded Systems, Microcontrollers, DSC & DSP, VLSI Design, Biometrics, RFID, Finger Print, Smart Cards, IRIS, Bar Code, Bluetooth, Zigbee, GPS, Voice Control, Remote System, Power Electronics, etc.,

ROBOTICS PROJECTS:

Mobile Robots, Service Robots, Industrial Robots, Defence Robots, Spy Robot, Artificial Robots, Automated Machine Control, Stair Climbing, Cleaning, Painting, Industry Security Robots, etc.,

MOBILE APPLICATION (ANDROID & J2ME):

Android Application, Web Services, Wireless Application, Bluetooth Application, WiFi Application, Mobile Security, Multimedia Projects, Multi Media, E-Commerce, Games Application, etc.,

MECHANICAL PROJECTS:

Auto Mobiles, Hydraulics, Robotics, Air Assisted Exhaust Breaking System, Automatic Trolley for Material Handling System in Industry, Hydraulics And Pneumatics, CAD/CAM/CAE Projects, Special Purpose Hydraulics And Pneumatics, CATIA, ANSYS, 3D Model Animations, etc.,

CONTACT US:

ECWAY TECHNOLOGIES
23/A, 2nd Floor, SKS Complex,
OPP. Bus Stand, Karur-639 001.
Tamil Nadu, India. Cell: +91 9894917187.
Website: www.ecwayprojects.com | www.ecwaytechnologies.com
Mail to: ecwaytechnologies@gmail.com.

Published in: Engineering

  1. ECWAY TECHNOLOGIES | IEEE SOFTWARE | EMBEDDED | MECHANICAL | ROBOTICS PROJECTS DEVELOPMENT. Our offices @ Chennai / Trichy / Karur / Erode / Madurai / Salem / Coimbatore / Bangalore / Hyderabad. Cell: +91 9894917187 | 875487 1111 / 2111 / 3111 / 4111 / 5111 / 6111. Visit: www.ecwaytechnologies.com | www.ecwayprojects.com. Mail to: ecwaytechnologies@gmail.com. AUTOMATIC FACIAL EXPRESSION RECOGNITION USING FEATURES OF SALIENT FACIAL PATCHES. A PROJECT REPORT submitted to the Department of Electronics & Communication Engineering, Faculty of Engineering & Technology, in partial fulfillment of the requirements for the award of the degree of MASTER OF TECHNOLOGY IN ELECTRONICS & COMMUNICATION ENGINEERING. APRIL 2016.
  2. CERTIFICATE: Certified that this project report titled “Automatic Facial Expression Recognition Using Features of Salient Facial Patches” is the bona fide work of Mr. _____________, who carried out the research under my supervision. Certified further that, to the best of my knowledge, the work reported herein does not form part of any other project report or dissertation on the basis of which a degree or award was conferred on an earlier occasion on this or any other candidate. Signature of the Guide / Signature of the H.O.D. Name / Name.
  3. DECLARATION: I hereby declare that the project work entitled “Automatic Facial Expression Recognition Using Features of Salient Facial Patches”, submitted to BHARATHIDASAN UNIVERSITY in partial fulfillment of the requirement for the award of the Degree of MASTER OF APPLIED ELECTRONICS, is a record of original work done by me under the guidance of Prof. A. Vinayagam M.Sc., M.Phil., M.E. To the best of my knowledge, the work reported here is not part of any other thesis or work on the basis of which a degree or award was conferred on an earlier occasion to me or any other candidate. (Student Name) (Reg. No) Place: Date:
  4. ACKNOWLEDGEMENT: I am extremely glad to present my project “Automatic Facial Expression Recognition Using Features of Salient Facial Patches”, which is a part of my curriculum of the third semester of the Master of Science in Computer Science. I take this opportunity to express my sincere gratitude to those who helped me in bringing out this project work. I would like to express my gratitude to my Director, Dr. K. ANANDAN, M.A.(Eco.), M.Ed., M.Phil.(Edn.), PGDCA., CGT., M.A.(Psy.), who gave me the opportunity to undertake this project. I am highly indebted to the Co-Ordinator, Prof. Muniappan, Department of Physics, and offer heartfelt thanks for the valuable comments I received on my project. I wish to express my deep sense of gratitude to my guide, Prof. A. Vinayagam M.Sc., M.Phil., M.E., for her immense help and encouragement toward the successful completion of this project. I also express my sincere thanks to all the staff members of Computer Science for their kind advice. Last, but not least, I express my deep gratitude to my parents and friends for their encouragement and support throughout the project.
  5. ABSTRACT: Extraction of discriminative features from salient facial patches plays a vital role in effective facial expression recognition. Accurate detection of facial landmarks improves the localization of the salient patches on face images. This paper proposes a novel framework for expression recognition using appearance features of selected facial patches. A few prominent facial patches, depending on the positions of facial landmarks, are extracted which are active during emotion elicitation. These active patches are further processed to obtain the salient patches, which contain discriminative features for classification of each pair of expressions, thereby selecting different facial patches as salient for different pairs of expression classes. A one-against-one classification method is adopted using these features. In addition, an automated learning-free facial landmark detection technique is proposed, which achieves performance similar to that of other state-of-the-art landmark detection methods, yet requires significantly less execution time. The proposed method is found to perform consistently well at different resolutions, hence providing a solution for expression recognition in low-resolution images. Experiments on the CK+ and JAFFE facial expression databases show the effectiveness of the proposed system.
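The one-against-one scheme mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: each pairwise classifier here is a toy stub standing in for an SVM trained on salient-patch features, and all names are hypothetical.

```python
# Sketch of one-against-one expression classification with majority voting.
# With 6 classes there are 15 pairwise classifiers; each votes for one of
# its two classes, and the class with the most votes wins.
from itertools import combinations
from collections import Counter

EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def one_against_one_predict(features, pairwise_classifiers):
    """Collect one vote per pairwise classifier and return the majority class."""
    votes = Counter()
    for (a, b), clf in pairwise_classifiers.items():
        votes[clf(features)] += 1  # clf returns either a or b
    return votes.most_common(1)[0][0]

# Toy stand-in classifiers (NOT trained models): any pair involving
# "happiness" votes for it; otherwise the lexicographically smaller label wins.
toy = {}
for a, b in combinations(EXPRESSIONS, 2):
    toy[(a, b)] = (lambda f, a=a, b=b:
                   "happiness" if "happiness" in (a, b) else min(a, b))

print(one_against_one_predict(None, toy))  # → happiness (5 pairwise wins)
```

The stub rules are arbitrary; the point is the voting structure, which is independent of how each pairwise decision is made.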
  6. INTRODUCTION: Facial expression, being a fundamental mode of communicating human emotions, finds applications in human-computer interaction (HCI), health care, surveillance, driver safety, deceit detection, etc. With tremendous success achieved in the fields of face detection and face recognition, affective computing has received substantial attention among researchers in the domain of computer vision. Signals that can be used for affect recognition include facial expression, paralinguistic features of speech, body language, and physiological signals (e.g., electromyogram (EMG), electrocardiogram (ECG), electrooculogram (EOG), electroencephalography (EEG), functional magnetic resonance imaging (fMRI), etc.). A review of signals and methods for affective computing has been reported, according to which most of the research on facial expression analysis is based on the detection of the basic emotions: anger, fear, disgust, happiness, sadness, and surprise. A number of novel methodologies for facial expression recognition have been proposed over the last decade. Effective expression analysis depends hugely upon the accurate representation of facial features. The Facial Action Coding System (FACS) represents the face by measuring all visually observable facial movements in terms of Action Units (AUs) and associates them with facial expressions. Accurate detection of AUs depends upon proper identification and tracking of different facial muscles irrespective of pose, face shape, illumination, and image resolution. According to Whitehill et al., the detection of all facial fiducial points is even more challenging than expression recognition itself. Therefore, most of the existing algorithms are based on geometric and appearance-based features. Models based on geometric features track the shape and size of the face and facial components such as eyes, lip corners, eyebrows, etc., and categorize expressions based on the relative positions of these facial components. Some researchers used shape models based on a set of characteristic points on the face to classify expressions. However, these methods usually require very accurate and reliable detection as well as tracking of the facial landmarks, which are difficult to achieve in many practical situations. Moreover, the distances between facial landmarks vary from person to person, thereby making person-independent expression recognition systems less reliable. Facial expressions involve changes in local texture. In appearance-based methods, a bank of filters such as Gabor wavelets, Local
  7. Binary Pattern (LBP), etc., is applied to either the whole face or specific face regions to encode the texture. The superior performance of appearance-based methods over geometry-based features has been reported. Appearance-based methods generate high-dimensional vectors, which are further represented in a lower-dimensional subspace by applying dimensionality reduction techniques such as principal component analysis (PCA), linear discriminant analysis (LDA), etc. Finally, classification is performed in the learned subspace. Although the time and space costs are higher in appearance-based methods, the preservation of discriminative information makes them very popular. Extraction of facial features by dividing the face region into several blocks achieves better accuracy, as reported by many researchers. However, this approach fails with improper face alignment and occlusions. Some earlier works on extraction of features from specific face regions mainly determine the facial regions that contribute most toward discrimination of expressions based on the training data. However, in these approaches, the positions and sizes of the facial patches vary according to the training data. Therefore, it is difficult to conceive a generic system using these approaches. In this paper, we propose a novel facial landmark detection technique as well as a salient-patch-based facial expression recognition framework with significant performance at different image resolutions. The proposed method localizes the face as well as the facial landmark points in an image, thereby extracting some salient patches that are estimated during the training stage. The appearance features from these patches are fed to a multi-class classifier to classify the images into six basic expression classes. It is found that the proposed facial landmark detection system performs similarly to state-of-the-art methods on near-frontal images, with lower computational complexity. Appearance features with a smaller number of histogram bins are used to reduce computation. The salient facial patches, which contribute significantly towards distinguishing one expression from the others, are selected empirically with predefined positions and sizes. Once the salient patches are selected, expression recognition becomes easy irrespective of the data. Affective computing aims at effective emotion recognition in low-resolution images, and the experimental results show that the proposed system performs well on low-resolution images.
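The texture encoding discussed above can be illustrated with the basic 3x3 LBP operator: each pixel is described by thresholding its 8 neighbours against the centre value to form an 8-bit code. A minimal NumPy sketch assuming the standard 8-neighbour formulation; it is not the paper's code.

```python
# Basic 3x3 Local Binary Pattern: threshold the 8 neighbours of each
# interior pixel against the centre, pack the results into an 8-bit code.
# Histograms of these codes over a region then serve as appearance features.
import numpy as np

def lbp_basic(img):
    """Return the LBP code map (values 0..255) for the interior pixels of a
    2-D grayscale array; border pixels are skipped for simplicity."""
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.int32)
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    centre = img[1:h-1, 1:w-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1+dy:h-1+dy, 1+dx:w-1+dx]
        codes += (neighbour >= centre).astype(np.int32) << bit
    return codes

demo = np.array([[5, 9, 1],
                 [4, 6, 7],
                 [2, 8, 3]])
# Neighbours clockwise from top-left: 5,9,1,7,3,8,2,4 vs centre 6
# -> threshold bits b0..b7 = 0,1,0,1,0,1,0,0 -> code 0b00101010 = 42
print(lbp_basic(demo)[0, 0])  # → 42
```

Variants such as uniform or rotation-invariant LBP (mentioned later in this document) remap these 256 codes onto fewer labels, but the thresholding step is the same.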
  8. The paper is organized as follows. Section 2 presents a review of earlier works. The proposed framework is presented in Section 3. Sections 4 and 5 discuss the facial landmark detection and feature extraction techniques, respectively. Experimental results and discussion are provided in Section 6. Section 7 concludes the paper.
  9. CONCLUSION: This paper has presented a computationally efficient facial expression recognition system for accurate classification of the six universal expressions. It investigates the relevance of different facial patches in the recognition of different facial expressions. All major active regions on the face, which are responsible for facial deformation during an expression, are extracted. The positions and sizes of these active regions are predefined. The system analyses the active patches and determines the salient areas on the face where the features are discriminative for different expressions. Using the appearance features from the salient patches, the system performs the one-against-one classification task and determines the expression based on a majority vote. In addition, a facial landmark detection method is described which detects some facial points accurately at lower computational cost. Expression recognition is carried out using the proposed landmark detection method as well as the recently proposed CLM model based on the DRMF method. In both cases, recognition accuracy is almost similar, whereas the computational cost of the proposed learning-free method is significantly lower. Promising results have been obtained by using block-based LBP histogram features of the salient patches. Extensive experiments have been carried out on two facial expression databases and the combined data set. Experiments were conducted using various bin widths of LBP histograms, uniform LBP, and rotation-invariant LBP features. Low-dimensional features are preferred, with sufficient recognition accuracy, to decrease computational complexity. Therefore, 16-bin LBP histogram features from four salient patches at a face resolution of 96x96 are used to obtain the best performance with a suitable trade-off between speed and accuracy. Our system found the classification between anger and sadness troublesome in all databases. The system performs well on the CK+ data set with an F-score of 94.39 percent. Using the salient patches obtained by training on the CK+ data set, the system achieves an F-score of 92.22 percent on the JAFFE data set, which demonstrates the generic performance of the system. The performance of the proposed system is comparable with earlier works of similar approach; nevertheless, our system is fully automated. Interestingly, the local features at the salient patches provide consistent performance at different resolutions. Thus, the proposed method can be employed in real-world applications with low-resolution imaging in real time. Security cameras, for instance, provide low-resolution images which can be analysed effectively using the proposed
  10. framework. Instead of the whole face, the proposed method classifies the emotion by assessing a few facial patches. Thus, there is a chance of improved performance on partially occluded images, which has not been addressed in this study. This analysis is confined to databases without facial hair. There is a possibility of improvement by using different appearance features. The dynamics of expression in the temporal domain are also not considered in this study. It would be interesting to explore the system incorporating motion features from different facial patches. The execution time reported for the proposed algorithm is based on unoptimized MATLAB code. However, an optimized implementation of the proposed framework will significantly reduce the computational cost, and real-time expression recognition can be achieved with substantial accuracy. Further analysis and effort are required to improve the performance by addressing some of the above-mentioned issues.
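The feature assembly described in the conclusion (16-bin LBP histograms pooled from four salient patches of a 96x96 face) can be sketched as follows. The patch positions and sizes below are hypothetical placeholders for illustration, not the trained salient locations from the paper.

```python
# Sketch: build a low-dimensional feature vector from a 96x96 face by taking
# a 16-bin histogram of LBP codes (0..255) in each of four patches and
# concatenating them. Patch coordinates are made-up placeholders.
import numpy as np

def patch_histogram(lbp_codes, top, left, size=24, bins=16):
    """Normalised 16-bin histogram of one patch: the 256 possible LBP
    codes are quantised into 16 equal-width bins."""
    patch = lbp_codes[top:top+size, left:left+size]
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return hist / hist.sum()

def face_features(lbp_codes, patches):
    """Concatenate per-patch histograms into one feature vector."""
    return np.concatenate([patch_histogram(lbp_codes, t, l)
                           for t, l in patches])

rng = np.random.default_rng(0)
codes = rng.integers(0, 256, size=(96, 96))          # stand-in LBP code map
salient = [(20, 16), (20, 56), (60, 30), (60, 50)]   # hypothetical patches
feats = face_features(codes, salient)
print(feats.shape)  # → (64,): four patches x 16 bins
```

Coarse 16-bin quantisation keeps the vector short (64 dimensions here versus 1024 for full 256-bin histograms), which matches the speed/accuracy trade-off argued for above.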
  11. REFERENCES: [1] R. A. Calvo and S. D’Mello, “Affect detection: An interdisciplinary review of models, methods, and their applications,” IEEE Trans. Affective Comput., vol. 1, no. 1, pp. 18–37, Jan. 2010. [2] J. Whitehill, M. S. Bartlett, and J. Movellan, “Automatic facial expression recognition,” in Social Emotions in Nature and Artifact. London, U.K.: Oxford Univ. Press, 2013. [3] S. M. Lajevardi and Z. M. Hussain, “Automatic facial expression recognition: Feature extraction and selection,” Signal, Image Video Process., vol. 6, no. 1, pp. 159–169, 2012. [4] G. Zhao and M. Pietikainen, “Dynamic texture recognition using local binary patterns with an application to facial expressions,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 29, no. 6, pp. 915–928, Jun. 2007. [5] T. Jabid, M. Kabir, and O. Chae, “Robust facial expression recognition based on local directional pattern,” ETRI J., vol. 32, pp. 784–794, 2010. [6] M. H. Kabir, T. Jabid, and O. Chae, “A local directional pattern variance (LDPv) based face descriptor for human facial expression recognition,” in Proc. 7th IEEE Int. Conf. Adv. Video Signal Based Surveillance, 2010, pp. 526–532. [7] C. Shan and R. Braspenning, “Recognizing facial expressions automatically from video,” in Handbook of Ambient Intelligence and Smart Environments. New York, NY, USA: Springer-Verlag, 2010, pp. 479–509.
