The document summarizes a wearable, real-time mobile gaze tracker called iShadow. The system pairs an eye-facing camera with a world-facing camera and uses an artificial neural network, trained on data from 10 subjects, to predict gaze direction from eye images. Its key result is a gaze-prediction error of roughly 3 degrees. iShadow targets low-power, real-time operation, but its limitations include variability across users, sensitivity to how the glasses sit on the face, and difficulty distinguishing eye-gaze shifts from head motion.
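The core idea of learning a pixels-to-gaze mapping with a neural network can be sketched as a small feed-forward regressor. Everything below is an illustrative assumption: the layer sizes, the synthetic training data, and the plain gradient-descent loop are stand-ins, not the paper's actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a downsampled eye image flattened to 100 pixels,
# mapped to a 2-D gaze point (x, y). The real iShadow model may differ.
n_pixels, n_hidden, n_out = 100, 16, 2

# Synthetic stand-in data; real training would use labeled eye images.
X = rng.normal(size=(500, n_pixels))
true_W = rng.normal(size=(n_pixels, n_out)) * 0.1
Y = X @ true_W  # linear ground-truth gaze, just for this demo

# One-hidden-layer MLP with tanh activation.
W1 = rng.normal(size=(n_pixels, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, n_out)) * 0.1
b2 = np.zeros(n_out)

def forward(X):
    H = np.tanh(X @ W1 + b1)   # hidden activations
    return H, H @ W2 + b2      # predicted (x, y) gaze

_, pred0 = forward(X)
init_mse = float(np.mean((pred0 - Y) ** 2))

# Plain batch gradient descent on mean squared error.
lr = 0.05
for _ in range(300):
    H, pred = forward(X)
    err = pred - Y
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H ** 2)   # backprop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
final_mse = float(np.mean((pred - Y) ** 2))
```

At inference time, a single forward pass maps one eye image to a gaze estimate, which is what makes this kind of model attractive for real-time, low-power operation.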