This document outlines a study that aims to develop a system that automatically detects students' affective states during learning in order to improve the learning experience. The proposed system uses multiple sensors and tools, including a neuroheadset, an eye tracker, facial expression recognition software, and a skin conductance sensor, to collect data on engagement, emotion, attention, and arousal. The real-time data from these devices will be analyzed to understand the student's experience and to provide customized feedback or instruction without human intervention.
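
To make the proposed pipeline concrete, the following is a minimal sketch of the kind of real-time sense-analyze-respond loop described above. The source does not specify device APIs or an analysis method, so every name here is hypothetical: sensor readings are simulated with random values standing in for the actual device SDKs, and the rule-based mapping from affective signals to feedback is an illustrative assumption, not the study's actual analysis.

```python
# Hypothetical sketch of the real-time loop: poll sensors, infer an
# affective state, and choose an instructional response. All interfaces
# are placeholders; a real system would read from the device SDKs.
import random
import time
from dataclasses import dataclass


@dataclass
class AffectiveSample:
    engagement: float  # e.g. from the neuroheadset, normalized to 0..1 (assumed)
    arousal: float     # e.g. from the skin conductance sensor, 0..1 (assumed)
    attention: float   # e.g. from the eye tracker, 0..1 (assumed)
    emotion: str       # e.g. a facial expression label such as "neutral"


def read_sensors() -> AffectiveSample:
    """Placeholder for polling the four devices; returns simulated data."""
    return AffectiveSample(
        engagement=random.random(),
        arousal=random.random(),
        attention=random.random(),
        emotion=random.choice(["neutral", "confused", "frustrated", "happy"]),
    )


def choose_feedback(s: AffectiveSample) -> str:
    """Toy rule-based mapping from affective state to a feedback action."""
    if s.emotion in ("confused", "frustrated") or s.engagement < 0.3:
        return "offer a hint or a simpler explanation"
    if s.attention < 0.3:
        return "prompt the student to refocus"
    if s.arousal > 0.8:
        return "suggest a short break"
    return "continue with the current material"


if __name__ == "__main__":
    for _ in range(5):  # one reading per second, for demonstration
        sample = read_sensors()
        print(sample, "->", choose_feedback(sample))
        time.sleep(1)
```

In practice, the thresholds and rules above would be replaced by whatever analysis the study develops (for example, a trained classifier over the fused sensor streams); the sketch only illustrates the closed loop from multimodal sensing to automated, customized feedback.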