A Real Time Facial Emotion Recognition Using Depth Sensor And Interfacing With Second Life Based Virtual 3D Avatar
Mounika Kakarla
ABSTRACT
• Facial expressions are a nonverbal form of communication: when we have no words to express our feelings, we use gestures and expressions to convey them. To detect and track emotions dynamically, a Kinect sensor is used, since its camera produces 3D depth maps.
• Facial emotions are detected in real time by fitting a mesh over the tracked face and identifying the points needed for feature extraction. The Facial Action Coding System (FACS) and Facial Animation Parameters (FAP) define the regions of interest for depicting the emotions (a rule-based sketch follows).
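• As a rough illustration of how tracked action units can map to the five emotions reported later in this deck, the C++ sketch below applies simple threshold rules. It is a minimal sketch only: the AnimationUnits fields mirror the six animation units exposed by the Kinect Face Tracking SDK, but the threshold values are illustrative assumptions, not the tuned values of this work.

    #include <cstdio>
    #include <string>

    // Animation-unit coefficients as exposed by the Kinect Face Tracking
    // SDK, each roughly in [-1, 1]. Field order follows the SDK's AU0..AU5.
    struct AnimationUnits {
        float upperLipRaiser;     // AU0
        float jawLowerer;         // AU1
        float lipStretcher;       // AU2
        float browLowerer;        // AU3
        float lipCornerDepressor; // AU4
        float outerBrowRaiser;    // AU5
    };

    // Simple rule-based classifier; thresholds are illustrative assumptions.
    std::string classifyEmotion(const AnimationUnits& au) {
        if (au.lipStretcher > 0.4f && au.lipCornerDepressor < 0.0f)
            return "SMILE";
        if (au.jawLowerer > 0.5f && au.outerBrowRaiser > 0.3f)
            return "SURPRISE";
        if (au.outerBrowRaiser > 0.4f && au.lipStretcher > 0.2f)
            return "FEAR";
        if (au.browLowerer > 0.4f && au.lipCornerDepressor > 0.2f)
            return "ANGER";
        if (au.lipCornerDepressor > 0.4f)
            return "SAD";
        return "NEUTRAL";
    }

    int main() {
        // Stretched lips with raised (not depressed) lip corners.
        AnimationUnits au = { 0.0f, 0.1f, 0.6f, -0.1f, -0.2f, 0.0f };
        std::printf("%s\n", classifyEmotion(au).c_str()); // prints SMILE
        return 0;
    }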
ABSTRACT
• Second Life is a 3D virtual world where people create a digital character called an 'Avatar' and interact with other people in the virtual world through it.
• An Avatar that shows gestures also needs emotions in particular scenarios.
• Using real-time facial emotion recognition based on the Kinect depth sensor, the same emotions are generated on an avatar in the Second Life 3D virtual world. This idea can be further extended to serve as a communication link through which speech- and hearing-impaired people can express their emotions.
DETAILS OF WORK IN PHASES
• PHASE – I: EMOTION RECOGNITION USING DEPTH SENSOR
• PHASE – II: EMOTIONS IN VIRTUAL 3D AVATAR, AND INTERFACING THE EMOTIONS FROM THE DEPTH SENSOR WITH THE VIRTUAL AVATAR
INTRODUCTION – KINECT SENSOR
• Kinect is a motion-sensing input device developed by Microsoft. It lets users interact with and control an application through gestures and body movement, without using a remote control.
INTRODUCTION – KINECT SENSOR
• The Natural User Interface (NUI) is the core of the Kinect for Windows API.
• Through the NUI, a developer can access sensor data such as the audio, depth, and image streams in an application.
• There are two models for getting image frames: the polling model and the event model (a sketch of the polling model follows).
• The polling model reads data frames on demand.
• The event model supports using those data streams with more accuracy and flexibility.
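• A minimal sketch of the polling model, assuming the Kinect for Windows SDK 1.x C++ API (NuiInitialize, NuiImageStreamOpen, NuiImageStreamGetNextFrame); error handling is abbreviated.

    #include <windows.h>
    #include <NuiApi.h> // Kinect for Windows SDK 1.x

    int main() {
        // Initialize the sensor for depth data.
        if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_DEPTH)))
            return 1;

        // Open the depth stream; the event handle is NULL because
        // we poll for frames instead of waiting on an event.
        HANDLE depthStream = NULL;
        if (FAILED(NuiImageStreamOpen(NUI_IMAGE_TYPE_DEPTH,
                                      NUI_IMAGE_RESOLUTION_320x240,
                                      0, 2, NULL, &depthStream))) {
            NuiShutdown();
            return 1;
        }

        for (int i = 0; i < 100; ++i) {
            // Polling model: block up to 100 ms for the next depth frame.
            const NUI_IMAGE_FRAME* frame = NULL;
            if (SUCCEEDED(NuiImageStreamGetNextFrame(depthStream, 100, &frame))) {
                // ... read the depth map from frame->pFrameTexture here ...
                NuiImageStreamReleaseFrame(depthStream, frame);
            }
        }

        NuiShutdown();
        return 0;
    }

In the event model, a Windows event handle is passed to NuiImageStreamOpen instead of NULL, and the application waits on it (e.g. with WaitForSingleObject) so each frame is handled as soon as it arrives.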
INTRODUCTION – SECOND LIFE
• Second Life is an online virtual world developed by Linden Lab and launched on June 23, 2003.
• A number of free client programs, called Viewers in Second Life, are used to access the Second Life world; through them, users, called Residents, interact with each other via avatars.
• Residents can explore the world (known as the grid), meet other residents, socialize, participate in individual and group activities, and create and trade virtual property and services with one another.
INTRODUCTION – SECOND LIFE
• Built into the software is a three-dimensional modelling tool, based on simple geometric shapes, that allows residents to build virtual objects.
• There is also a procedural scripting language, the Linden Scripting Language, which can be used to add interactivity to objects.
• Sculpted prims (sculpties), mesh, textures for clothing or other objects, animations, and gestures can be created using external software and imported.
PHASE – I
FLOWCHART FOR EMOTIONS USING KINECT SENSOR
REAL TIME RESULTS OF EMOTIONS
EMOTION – SMILE
EMOTION – SURPRISE
EMOTION – FEAR
EMOTION – ANGER
EMOTION – SAD
PHASE – II
FLOWCHART FOR EMOTIONS USING SECOND LIFE
EMOTIONS OF AVATARS IN SECOND LIFE
INTERFACING THE EMOTIONS FROM KINECT SENSOR WITH THOSE OF AVATARS IN SECOND LIFE
• A '.dll' file was added which enables the transfer of keystrokes.
• The application is invoked by calling an instance of the class.
• Keyboard strokes are defined wherever emotions are detected, so each detected emotion triggers the corresponding avatar gesture (a sketch follows).
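• A minimal sketch of the keystroke injection, assuming the Win32 SendInput API; the mapping of emotions to function keys is an illustrative assumption (in practice, each key is bound to a matching avatar gesture inside the Second Life viewer).

    #include <windows.h>
    #include <string>

    // Send one press-and-release keystroke to the foreground window;
    // the Second Life viewer must have focus to receive it.
    void sendKeystroke(WORD virtualKey) {
        INPUT inputs[2] = {};
        inputs[0].type = INPUT_KEYBOARD;
        inputs[0].ki.wVk = virtualKey;           // key down
        inputs[1].type = INPUT_KEYBOARD;
        inputs[1].ki.wVk = virtualKey;
        inputs[1].ki.dwFlags = KEYEVENTF_KEYUP;  // key up
        SendInput(2, inputs, sizeof(INPUT));
    }

    // Illustrative mapping: each detected emotion fires the hotkey
    // that the matching avatar gesture is bound to in Second Life.
    void triggerAvatarEmotion(const std::string& emotion) {
        if      (emotion == "SMILE")    sendKeystroke(VK_F2);
        else if (emotion == "SURPRISE") sendKeystroke(VK_F3);
        else if (emotion == "FEAR")     sendKeystroke(VK_F4);
        else if (emotion == "ANGER")    sendKeystroke(VK_F5);
        else if (emotion == "SAD")      sendKeystroke(VK_F6);
    }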
CONCLUSION & FUTURE WORK
• FACS has been used to define the AUs (Action Units) of the emotions. The connection of the emotions to the 3D virtual avatar in the Second Life virtual world is achieved by implementing small changes to the code.
• This project can be further extended into an empathy machine that helps speech- and hearing-impaired people easily express their emotions to others.
REFERENCES
• Abdallah A. Mohamed and Roman V. Yampolskiy, "Using Discrete Wavelet Transform and Eigenfaces for Recognizing Avatars Faces," Department of Computer Engineering and Computer Science, University of Louisville, Louisville, USA, 2012.
• P. Ekman, W. V. Friesen, and P. Ellsworth, Emotion in the Human Face: Guidelines for Research and an Integration of Findings, Pergamon Press Inc., 1972.
• F. Abdat, C. Maaoui, and A. Pruski, "Human-Computer Interaction Using Emotion Recognition from Facial Expression," Laboratoire d'Automatique humaine et de Sciences Comportementales, Université de Metz, Metz, France, 2011.
• Songfan Yang and Bir Bhanu, "Facial Expression Recognition Using Emotion Avatar Image," Center for Research in Intelligent Systems, University of California, Riverside, 2010.
• Albert Cruz and Bir Bhanu, "A Biologically Inspired Approach for Fusing Facial Expression and Appearance for Emotion Recognition," Center for Research in Intelligent Systems, University of California, Riverside, 2011.
• Majdi Dammak, Mohamed Ben Ammar, and Adel M. Alimi, "Real-Time Analysis of Non-Verbal Upper-Body Expressive Gestures," REGIM: REsearch Group on Intelligent Machines, University of Sfax, 2012.
• Cohn-Kanade AU-Coded Facial Expression Database, Carnegie Mellon University, Robotics Institute, February 2011.
• Klaus Scherer, GEMEP-FERA Data Set, Facial Expression Recognition and Analysis Challenge, 2011.
