This research studied the emotional impact of digital media by collecting biometric data from 20 participants playing Guitar Hero. Sensors recorded electroencephalography (EEG), eye movements, galvanic skin response, head motion, and facial expressions. The data were analyzed using structural equation modeling and network analysis to identify patterns in physical, emotional, and cognitive attributes during easy and hard songs. Near-real-time displays and time-slice analyses provided operational models of brain states during gameplay tasks. The findings suggest these methods could be used to personalize media experiences, study emotion and learning, and improve affective tutoring systems.
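To illustrate the kind of time-slice network analysis described above, here is a minimal sketch in Python using synthetic data. The feature names, sample rate, window length, and correlation threshold are all hypothetical assumptions for illustration, not values from the study: it windows a multichannel biometric recording and builds a correlation network per slice, with features as nodes and strong pairwise correlations as edges.

```python
import numpy as np
import networkx as nx

# Hypothetical multichannel biometric recording: rows are time samples,
# columns are features (e.g., EEG band power, GSR level, head motion).
rng = np.random.default_rng(0)
feature_names = ["eeg_alpha", "eeg_beta", "gsr", "head_motion", "gaze_dispersion"]
n_samples = 600
data = rng.standard_normal((n_samples, len(feature_names)))

def slice_network(window, names, threshold=0.3):
    """Build a correlation network for one time slice: nodes are features,
    edges connect feature pairs whose |Pearson r| meets the threshold."""
    corr = np.corrcoef(window, rowvar=False)
    g = nx.Graph()
    g.add_nodes_from(names)
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = corr[i, j]
            if abs(r) >= threshold:
                g.add_edge(names[i], names[j], weight=r)
    return g

# Time-slice analysis: step through non-overlapping 5-second windows
# (assuming a 20 Hz sample rate) and track how densely features co-vary.
window_len = 100
for start in range(0, n_samples, window_len):
    g = slice_network(data[start:start + window_len], feature_names)
    print(f"t={start}: {g.number_of_edges()} edges, density={nx.density(g):.2f}")
```

Tracking edge density across slices gives one simple operational readout of how tightly physiological signals couple as task difficulty changes; the study's actual modeling (structural equation modeling, richer network measures) would go beyond this thresholded-correlation sketch.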