EMOTION RECOGNITION USING SMARTPHONES
- Madhusudhan (17)

OBJECTIVE
• To propose the development of Android applications that can be used for sensing the emotions of people, for their better health.
• To provide better services and better human-machine interaction.

INTRODUCTION
• Emotions control your thinking, behavior and actions.
• Emotions affect your physical body as much as your body affects your feelings and thinking.
• People who ignore, dismiss or repress their emotions are setting themselves up for physical illness.

DETECTION
• Detection of emotional info can be done with passive sensors which capture data about the user's physical state or behavior without interpreting the input.
• A video camera might capture facial expressions, body posture and gestures.
• A microphone can capture speech.

• A pressure sensor/accelerometer can capture heart rate.
• Other sensors detect emotional cues from skin temperature.

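Since the deck proposes Android applications for this kind of passive sensing, here is a minimal Kotlin sketch of how such an app might read the built-in heart-rate sensor through Android's standard SensorManager API. The activity name and the way the reading is consumed are illustrative assumptions; the sketch also assumes a device that exposes Sensor.TYPE_HEART_RATE and an app that has been granted the BODY_SENSORS permission.

```kotlin
import android.app.Activity
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle

// Hypothetical activity that passively samples the device's heart-rate sensor.
class HeartRateActivity : Activity(), SensorEventListener {

    private lateinit var sensorManager: SensorManager
    private var heartRateSensor: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
        // Null if the device has no heart-rate sensor; reading it requires BODY_SENSORS.
        heartRateSensor = sensorManager.getDefaultSensor(Sensor.TYPE_HEART_RATE)
    }

    override fun onResume() {
        super.onResume()
        heartRateSensor?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent) {
        if (event.sensor.type == Sensor.TYPE_HEART_RATE) {
            val bpm = event.values[0]  // beats per minute as reported by the sensor
            // Hand the reading to the emotion-recognition pipeline here (not shown).
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        // Not needed for this sketch.
    }
}
```
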
RECOGNIZING
• Extraction of meaningful patterns from the gathered data.
• Speech recognition, natural language processing, or facial expression detection.
• These techniques should produce either labels or efficient inference algorithms to extract high-level information from the data.

SPEECH
• Requires the creation of a reliable database or knowledge base, as well as the selection of a classifier that allows quick and accurate emotion identification.
• Currently, the most frequently used classifiers are linear discriminant classifiers (LDC), k-nearest neighbour (k-NN), Gaussian mixture models (GMM), support vector machines (SVM), artificial neural networks (ANN), decision tree algorithms and hidden Markov models (HMM).

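As a rough illustration of the "reliable database or knowledge base" this step calls for, here is a minimal Kotlin sketch of a labeled feature store that any of the listed classifiers could be trained on. The class names, feature descriptions and label strings are placeholders, not something specified in the slides.

```kotlin
// Minimal sketch of a labeled speech-feature store for emotion recognition.
data class SpeechSample(
    val features: DoubleArray,   // e.g. pitch statistics, energy, MFCC summaries (illustrative)
    val emotion: String          // e.g. "happy", "sad", "angry", "neutral"
)

class SpeechEmotionDatabase {
    private val samples = mutableListOf<SpeechSample>()

    fun add(features: DoubleArray, emotion: String) {
        samples += SpeechSample(features, emotion)
    }

    // The whole labeled set is handed to whichever classifier is selected (LDC, k-NN, SVM, ...).
    fun trainingSet(): List<SpeechSample> = samples.toList()

    fun size(): Int = samples.size
}
```
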
FACIAL EXPRESSIONS
• Expressions are defined in terms of muscle actions: a system has been conceived to formally categorize the physical expression of emotions by studying the contraction or relaxation of one or more muscles.
• This is the concept behind the Facial Action Coding System (FACS), created by Paul Ekman and Wallace V. Friesen in 1978.
• E.g. Affdex.

• By identifying different facial cues, scientists are able to map them to their corresponding Action Unit code.
• They have proposed the following classification of basic emotions according to their Action Units:

  Emotion     Action Units
  Happiness   6+12
  Sadness     1+4+15
  Surprise    1+2+5B+26
  Fear        1+2+4+5+20+26
  Anger       4+5+7+23
  Disgust     9+15+16
  Contempt    R12A+R14A

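A small sketch of how the table above could be used in code: a lookup from each emotion to its Action Unit set, matched against whatever AUs a facial-expression detector reports. The detector itself (for example an Affdex-style SDK) is assumed and not shown, and the matching rule (all AUs of an emotion must be present) is a simplification for illustration.

```kotlin
// Map each emotion from the slide's table to its FACS Action Unit combination.
val emotionToActionUnits = mapOf(
    "Happiness" to setOf("6", "12"),
    "Sadness"   to setOf("1", "4", "15"),
    "Surprise"  to setOf("1", "2", "5B", "26"),
    "Fear"      to setOf("1", "2", "4", "5", "20", "26"),
    "Anger"     to setOf("4", "5", "7", "23"),
    "Disgust"   to setOf("9", "15", "16"),
    "Contempt"  to setOf("R12A", "R14A")
)

// Return the emotions whose full Action Unit pattern appears in the detected set.
fun matchEmotions(detectedAUs: Set<String>): List<String> =
    emotionToActionUnits.filter { (_, aus) -> detectedAUs.containsAll(aus) }.keys.toList()

fun main() {
    // Example: AU 6 (cheek raiser) plus AU 12 (lip corner puller) matches Happiness.
    println(matchEmotions(setOf("6", "12")))  // [Happiness]
}
```
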
HEART BEAT
• One of the most commonly used techniques is to use pressure sensors or an accelerometer.
• The heart rate can also be collected using an optical pulse sensor, e.g. the Samsung Galaxy S5.
• Heart Rate Variability (HRV) signals are derived from ECG signals through a QRS detection algorithm.

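To make the HRV step concrete, below is a minimal Kotlin sketch of two standard time-domain HRV statistics (SDNN and RMSSD) computed over R-R intervals. It assumes the R-peak timestamps have already been produced by a QRS detection step, which is not implemented here, and the sample values in main are made up for illustration.

```kotlin
import kotlin.math.sqrt

// R-R intervals in milliseconds from R-peak timestamps given in seconds.
fun rrIntervalsMs(rPeakTimesSec: DoubleArray): DoubleArray =
    DoubleArray(rPeakTimesSec.size - 1) { i -> (rPeakTimesSec[i + 1] - rPeakTimesSec[i]) * 1000.0 }

// SDNN: standard deviation of the R-R intervals.
fun sdnn(rr: DoubleArray): Double {
    val mean = rr.average()
    return sqrt(rr.map { (it - mean) * (it - mean) }.average())
}

// RMSSD: root mean square of successive R-R interval differences.
fun rmssd(rr: DoubleArray): Double {
    val diffs = DoubleArray(rr.size - 1) { i -> rr[i + 1] - rr[i] }
    return sqrt(diffs.map { it * it }.average())
}

fun main() {
    // Illustrative R-peak times for a resting heart rate of roughly 75 bpm.
    val rPeaks = doubleArrayOf(0.00, 0.80, 1.62, 2.41, 3.23, 4.02)
    val rr = rrIntervalsMs(rPeaks)
    println("SDNN  = ${"%.1f".format(sdnn(rr))} ms")
    println("RMSSD = ${"%.1f".format(rmssd(rr))} ms")
}
```
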
• HRV is used as a statistical feature to distinguish emotional stress, through a nonlinear classifier (k-nearest neighbour, k-NN), into three classes: negative emotions, positive emotions (surprise and happiness) and neutral.

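A minimal k-nearest-neighbour sketch for the three-class split described above. The training feature vectors (here, hypothetical SDNN/RMSSD pairs in milliseconds) and the choice k = 3 are illustrative assumptions, not values taken from the slides.

```kotlin
import kotlin.math.sqrt

data class Sample(val features: DoubleArray, val label: String)

// Euclidean distance between two feature vectors of equal length.
fun distance(a: DoubleArray, b: DoubleArray): Double =
    sqrt(a.indices.sumOf { i -> (a[i] - b[i]) * (a[i] - b[i]) })

// Classify by majority vote among the k nearest training samples.
fun knnClassify(train: List<Sample>, query: DoubleArray, k: Int = 3): String =
    train.sortedBy { distance(it.features, query) }   // nearest first
        .take(k)                                       // keep the k closest samples
        .groupingBy { it.label }.eachCount()           // vote by label
        .maxByOrNull { it.value }!!.key

fun main() {
    // Hypothetical (SDNN, RMSSD) pairs with emotion-class labels.
    val train = listOf(
        Sample(doubleArrayOf(30.0, 25.0), "negative"),
        Sample(doubleArrayOf(35.0, 28.0), "negative"),
        Sample(doubleArrayOf(60.0, 55.0), "positive"),
        Sample(doubleArrayOf(65.0, 58.0), "positive"),
        Sample(doubleArrayOf(45.0, 42.0), "neutral"),
        Sample(doubleArrayOf(48.0, 44.0), "neutral")
    )
    println(knnClassify(train, doubleArrayOf(62.0, 50.0)))  // votes favour "positive"
}
```
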
CONCLUSION
• Understanding emotions with the help of smartphones will help people achieve greater success in their lives.
• Using smartphones to do this will make it much easier for researchers and scientists.

REFERENCES
• http://www.mkprojects.com/fa_emotions.html by Mary Kurus
• Emotion Recognition from Speech by Ankur Sapra, Nikhil Panwar and Sohan Panwar, Jaypee Institute of Information Technology, Noida
• http://en.wikipedia.org/wiki/Affective_computing#Emotional_speech
• Mobile Sensor Data Collector using Android Smartphone by Won-Jae Yi, Weidi Jia and Jafar Saniie, Department of Electrical and Computer Engineering, Illinois Institute of Technology
• EmotionSense: A Mobile Phones based Adaptive Platform for Experimental Social Psychology Research

THANK YOU
