This document describes an auditory brain-computer interface (BCI) for navigation based on event-related potential (ERP) classification. The system uses an auditory oddball paradigm with five speakers positioned around the user to present navigation commands. Experimental conditions included a discrete and a continuous navigation mode. In the discrete mode, a 5x5 classification buffer and linear discriminant analysis (LDA) were used to classify commands for individual movements. In the continuous mode, a penalized, sigmoid-transformed classification buffer was multiplied by a direction matrix to map classifier outputs onto continuous movement directions. The discrete condition produced trajectories closer to optimal, while subjects rated the continuous condition as more enjoyable.
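The two decoding modes described above can be illustrated with a minimal sketch. The source does not specify the speaker geometry, the penalty value, or the sigmoid parameters, so every function name, parameter, and the assumption of five equally spaced speakers below are hypothetical; the sketch only shows the general shape of the computation: discrete mode picks the single most likely class from the buffer, while continuous mode squashes penalized per-class evidence through a sigmoid and matrix-multiplies it with speaker direction vectors to obtain a movement direction.

```python
import numpy as np

def sigmoid(x, gain=4.0, bias=0.5):
    """Logistic squashing of class evidence (gain/bias values are illustrative)."""
    return 1.0 / (1.0 + np.exp(-gain * (x - bias)))

def discrete_command(buffer_scores):
    """Discrete mode sketch: choose the class with the highest mean
    evidence in the 5x5 buffer (rows = classes, columns = recent trials)."""
    return int(np.argmax(buffer_scores.mean(axis=1)))

def continuous_direction(buffer_scores, penalty=0.2):
    """Continuous mode sketch: map a 5x5 classification buffer to a 2-D
    movement vector. The penalty, sigmoid transform, and equally spaced
    speaker layout are assumptions, not taken from the source."""
    # Average recent evidence per class, apply the penalty, squash with a sigmoid
    weights = sigmoid(buffer_scores.mean(axis=1) - penalty)
    # Assumed layout: 5 speakers every 72 degrees, front speaker at 90 degrees
    angles = np.deg2rad(90 - 72 * np.arange(5))
    directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # shape (5, 2)
    # Matrix multiplication maps class weights to a continuous direction
    move = weights @ directions
    norm = np.linalg.norm(move)
    return move / norm if norm > 1e-9 else np.zeros(2)
```

For example, a buffer whose first row (the front speaker) holds strong positive scores yields a unit vector pointing forward, with the symmetric contributions of the other four speakers cancelling out.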