Automated analysis of emotions and affects is an inherently challenging problem with a broad range of applications in Human-Computer Interaction (HCI), health informatics, assistive technologies, and multimedia retrieval. Understanding humans' basic and more specific emotions, and reacting accordingly, can improve HCI. Moreover, equipping machines with the ability to recognize emotions that humans express when interacting with one another endows them with a form of socio-affective intelligence. In this talk, I will present approaches based on deep neural networks that analyse speech and face images to infer human emotions and mental states.