EmotionSense: Real-Time Emotion Detection System
EmotionSense is a cutting-edge system designed to detect human emotions in real time. By leveraging machine learning and computer vision, it interprets emotional states from facial expressions. This interactive system aims to understand and react to human emotions dynamically.
by Vanshree Awasthi
Core Technologies Behind EmotionSense
Python: The primary programming language for development.
OpenCV: Handles image and video processing tasks.
TensorFlow/Keras: Powers the CNN model for emotion recognition.
FER-2013 Dataset: Used for training the emotion detection model.
EmotionSense Key Features
Real-Time Detection: Processes the webcam feed in real time.
Seven Emotion Classes: Trained on the FER-2013 dataset.
Lightweight Model: Ensures fast inference and efficient processing.
Graphical Interface: Provides user-friendly interaction.
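The seven emotion classes come from FER-2013 (angry, disgust, fear, happy, sad, surprise and neutral, in the dataset's conventional order). A minimal sketch of turning a model's softmax output into one of these labels; the probability vector and the 0.4 confidence threshold are illustrative assumptions, not values taken from the slides:

```python
# Map a seven-way softmax output onto the FER-2013 label set.
# The threshold below is an assumed cut-off for a low-confidence fallback.

FER2013_LABELS = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def decode_emotion(probs, threshold=0.4):
    """Return (label, confidence); fall back to "Low Confidence" under threshold."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    confidence = probs[best]
    if confidence < threshold:
        return "Low Confidence", confidence
    return FER2013_LABELS[best], confidence

# Example: a softmax output strongly favouring "Happy" (index 3).
print(decode_emotion([0.02, 0.01, 0.05, 0.80, 0.05, 0.04, 0.03]))  # → ('Happy', 0.8)
```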
Objectives of EmotionSense
1. Accurate Detection: Detect facial expressions accurately.
2. Efficient CNN Model: Create a lightweight CNN model.
3. Emotional Feedback: Provide feedback based on visual input.
4. Enhanced Interaction: Improve user interaction using emotion awareness.
Scope and Applications
Mental Health Monitoring: Aids early detection of emotional distress.
Education: Enhances understanding of student engagement.
Customer Service: Improves customer experience and satisfaction.
Human-Computer Interaction: Creates more responsive systems.
Project Flow: How It Works
1. Input: Image or video input is received.
2. Preprocessing: Input data undergoes preprocessing.
3. Emotion Detection Model: The CNN model analyzes facial features.
4. Emotion Prediction: The emotion is predicted from the model's output.
5. Output: Results are displayed via GUI or logs.
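The preprocessing step above can be sketched with NumPy alone. FER-2013 models conventionally take 48x48 grayscale input scaled to [0, 1]; the exact normalisation is an assumption here, and the nearest-neighbour resize stands in for what OpenCV's cv2.resize would do in the real pipeline:

```python
import numpy as np

# Sketch of step 2 (Preprocessing): RGB frame -> normalised 48x48 grayscale
# tensor with batch and channel dimensions, ready for a Keras-style CNN.

def preprocess(frame: np.ndarray, size: int = 48) -> np.ndarray:
    """Convert an HxWx3 uint8 frame to a (1, size, size, 1) float32 tensor."""
    gray = frame.mean(axis=2)                        # crude RGB -> grayscale
    h, w = gray.shape
    rows = np.arange(size) * h // size               # nearest-neighbour
    cols = np.arange(size) * w // size               # sampling grid
    resized = gray[rows][:, cols]
    normalised = resized.astype(np.float32) / 255.0  # scale to [0, 1]
    return normalised[np.newaxis, :, :, np.newaxis]  # add batch + channel dims

frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)  # fake webcam frame
x = preprocess(frame)
print(x.shape)  # (1, 48, 48, 1)
```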
EmotionSense: ER Diagram
Entity | Attributes
User | user_id, name, age, gender
Session | session_id, user_id, timestamp
Emotion_Record | record_id, session_id, detected_emotion, confidence_score
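The ER diagram maps directly onto a relational schema. A sketch in SQLite using Python's standard library; the column types and foreign keys are assumptions, since the slide lists only attribute names:

```python
import sqlite3

# The three entities from the ER diagram as SQLite tables. Types and
# NOT NULL constraints are assumed; the slide specifies only the names.
SCHEMA = """
CREATE TABLE User (
    user_id   INTEGER PRIMARY KEY,
    name      TEXT NOT NULL,
    age       INTEGER,
    gender    TEXT
);
CREATE TABLE Session (
    session_id INTEGER PRIMARY KEY,
    user_id    INTEGER NOT NULL REFERENCES User(user_id),
    timestamp  TEXT NOT NULL
);
CREATE TABLE Emotion_Record (
    record_id        INTEGER PRIMARY KEY,
    session_id       INTEGER NOT NULL REFERENCES Session(session_id),
    detected_emotion TEXT NOT NULL,
    confidence_score REAL NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
conn.execute("INSERT INTO User (name, age, gender) VALUES ('Alice', 30, 'F')")
conn.execute("INSERT INTO Session (user_id, timestamp) VALUES (1, '2024-01-01T10:00:00')")
conn.execute(
    "INSERT INTO Emotion_Record (session_id, detected_emotion, confidence_score) "
    "VALUES (1, 'Happy', 0.92)"
)
row = conn.execute(
    "SELECT detected_emotion, confidence_score FROM Emotion_Record"
).fetchone()
print(row)  # → ('Happy', 0.92)
```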
EmotionSense: Test Cases
Test Case | Input | Expected Output | Actual Output | Status
TC01 | Happy face image | "Happy" | "Happy" | Pass
TC02 | Angry face image | "Angry" | "Angry" | Pass
TC03 | Neutral face | "Neutral" | "Neutral" | Pass
TC04 | Blurred image | "Error/Low Confidence" | "Low Confidence" | Pass
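These cases could be automated once the model is wrapped in a single classification call. A sketch with a stubbed classifier, so the expected behaviour, including the low-confidence path for blurred input, can be exercised without the trained CNN; the probability vectors are fabricated stand-ins for real images, and the 0.4 threshold is an assumption:

```python
# Automated version of the test-case table, using a stub in place of
# the real CNN. Each probability vector simulates one input image.

def classify(probs, threshold=0.4):
    labels = ["Angry", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best] if probs[best] >= threshold else "Low Confidence"

TEST_CASES = {
    "TC01": ([0.05, 0.00, 0.00, 0.90, 0.02, 0.02, 0.01], "Happy"),
    "TC02": ([0.85, 0.02, 0.05, 0.01, 0.04, 0.02, 0.01], "Angry"),
    "TC03": ([0.05, 0.01, 0.04, 0.05, 0.05, 0.05, 0.75], "Neutral"),
    "TC04": ([0.20, 0.10, 0.15, 0.15, 0.15, 0.10, 0.15], "Low Confidence"),  # blurred
}

for name, (probs, expected) in TEST_CASES.items():
    got = classify(probs)
    print(f"{name}: expected {expected!r}, got {got!r}, "
          f"{'Pass' if got == expected else 'Fail'}")
```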
EmotionSense: Comparative Analysis
Feature | EmotionSense | Traditional Systems
Real-time Detection | Yes | Limited
Accuracy | ~70-75% (FER-2013) | Lower
Conclusion and Future Enhancements
Conclusion: Successfully detects emotions in real time.
Expand Emotion Coverage: Add disgust and contempt recognition.
Improve Accuracy: Use advanced models (ResNet, EfficientNet).
Multimodal Detection: Integrate audio input.
Deployment: Deploy as a web or mobile app.
