Abstract—The development of Microsoft Kinect opened up the research field of computational emotions to a wide range of applications, such as learning environments, which are excellent candidates for trialling computational emotion algorithms but were previously infeasible with available consumer technologies. Whilst Kinect is an accessible and affordable technology, it comes with its own challenges, such as the limited number of extracted Action Units (AUs).
This paper presents a new approach that attempts to find patterns of interaction among AUs on one hand, and patterns linking the relevant AUs to a given emotion on the other. In doing so, this paper lays the groundwork for a model that dynamically generates a personified set of rules relating AUs to emotions, implicitly encoding a person's individuality in expressing emotions.
Index Terms—computational psychoanalysis, emotion modelling, user-centred emotion detection, sentiment analysis, Kinect, personified adaptive interfaces