This document discusses a research paper on using emotional intelligence for human-computer interaction. It opens with an introduction describing the important role emotions play in areas such as learning, memory, and decision making. The paper then surveys current methods for detecting user emotions, including facial expressions, voice analysis, and physiological signals. It presents three test cases analyzing voice parameters such as pitch and volume for normal, angry, and panicked emotional states. Features are extracted from speech samples, and classifiers are used to map those features to emotions. The goal is to develop systems that can accurately interpret a user's affective feedback by detecting their emotional state.
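
The feature-extraction-plus-classification pipeline described above can be sketched in a minimal way. The following is an illustrative example only, not the paper's actual method: it estimates pitch by autocorrelation and volume by RMS energy from a single audio frame, then assigns an emotion with a simple nearest-centroid rule. The centroid values and emotion labels are hypothetical placeholders.

```python
import numpy as np

def extract_features(frame, sample_rate=16000):
    """Extract two simple prosodic features from one speech frame:
    pitch (Hz, via autocorrelation) and volume (RMS energy)."""
    # Volume: root-mean-square energy of the frame
    volume = np.sqrt(np.mean(frame ** 2))
    # Pitch: lag of the autocorrelation peak within a plausible
    # human pitch range (50-400 Hz)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    min_lag = sample_rate // 400
    max_lag = sample_rate // 50
    peak_lag = min_lag + int(np.argmax(corr[min_lag:max_lag]))
    pitch = sample_rate / peak_lag
    return np.array([pitch, volume])

def classify(features, centroids):
    """Nearest-centroid classifier: pick the emotion whose mean
    feature vector lies closest to the observed features."""
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))

# Demo with a synthetic 200 Hz tone standing in for a speech frame
sr = 16000
t = np.arange(sr // 10) / sr                 # 100 ms of samples
frame = 0.5 * np.sin(2 * np.pi * 200 * t)
feats = extract_features(frame, sr)

# Hypothetical per-emotion centroids: (pitch in Hz, RMS volume)
centroids = {
    "normal":   np.array([200.0, 0.35]),
    "angry":    np.array([300.0, 0.60]),
    "panicked": np.array([380.0, 0.50]),
}
print(classify(feats, centroids))            # the tone lands nearest "normal"
```

Real systems would use many more features (e.g. spectral and timing measures) and a trained classifier rather than hand-set centroids, but the structure — frame the signal, extract features, map features to an emotion label — is the same.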